Sep 30 06:19:15 crc systemd[1]: Starting Kubernetes Kubelet...
Sep 30 06:19:15 crc restorecon[4664]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:15 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 
06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Sep 30 06:19:16 crc 
restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 
06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 06:19:16 crc restorecon[4664]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Sep 30 06:19:16 crc restorecon[4664]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Sep 30 06:19:16 crc kubenswrapper[4691]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 06:19:16 crc kubenswrapper[4691]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Sep 30 06:19:16 crc kubenswrapper[4691]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 06:19:16 crc kubenswrapper[4691]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Sep 30 06:19:16 crc kubenswrapper[4691]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Sep 30 06:19:16 crc kubenswrapper[4691]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.974294 4691 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979263 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979295 4691 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979305 4691 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979314 4691 feature_gate.go:330] unrecognized feature gate: NewOLM Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979323 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979331 4691 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979343 4691 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979356 4691 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
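
All six deprecated-flag warnings above point at the same remedy: carry the values in the KubeletConfiguration file named by --config instead of on the kubelet command line. A minimal sketch of such a file follows, assuming illustrative paths and values only — nothing below is read from this node's actual configuration:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    # replaces --container-runtime-endpoint (endpoint is a placeholder)
    containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
    # replaces --volume-plugin-dir (path is a placeholder)
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
    # replaces --register-with-taints (taint chosen for illustration)
    registerWithTaints:
    - key: node-role.kubernetes.io/master
      effect: NoSchedule
    # replaces --system-reserved (reservations chosen for illustration)
    systemReserved:
      cpu: 500m
      memory: 1Gi
    # --minimum-container-ttl-duration has no config-file equivalent; its
    # warning suggests eviction thresholds instead, e.g.:
    evictionHard:
      memory.available: 100Mi

--pod-infra-container-image is the exception: per its warning it is simply going away once the image garbage collector takes the sandbox image from the CRI.
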
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979377 4691 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979388 4691 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979397 4691 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979428 4691 feature_gate.go:330] unrecognized feature gate: Example Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979438 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979448 4691 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979457 4691 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979466 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979476 4691 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979485 4691 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979494 4691 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979502 4691 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979513 4691 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979523 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979535 4691 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979547 4691 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979555 4691 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979567 4691 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979576 4691 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979584 4691 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979593 4691 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979602 4691 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979610 4691 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.979619 4691 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980407 4691 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980423 4691 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980433 4691 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980442 4691 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980451 4691 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980460 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980468 4691 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980478 4691 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980487 4691 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980495 4691 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980503 4691 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980512 4691 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980520 4691 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980528 4691 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980536 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980544 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980553 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980561 4691 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980569 4691 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980577 4691 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980585 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980596 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980604 4691 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980613 4691 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980622 4691 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980631 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980639 4691 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980650 4691 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980658 4691 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980667 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980675 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980683 4691 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980691 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980699 4691 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980707 4691 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980717 4691 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980725 4691 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980734 4691 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.980741 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.980973 4691 flags.go:64] FLAG: --address="0.0.0.0"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.980992 4691 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981359 4691 flags.go:64] FLAG: --anonymous-auth="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981373 4691 flags.go:64] FLAG: --application-metrics-count-limit="100"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981386 4691 flags.go:64] FLAG: --authentication-token-webhook="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981396 4691 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981408 4691 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981420 4691 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981431 4691 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981441 4691 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981451 4691 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981461 4691 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981471 4691 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981480 4691 flags.go:64] FLAG: --cgroup-root=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981490 4691 flags.go:64] FLAG: --cgroups-per-qos="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981500 4691 flags.go:64] FLAG: --client-ca-file=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981509 4691 flags.go:64] FLAG: --cloud-config=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981519 4691 flags.go:64] FLAG: --cloud-provider=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981528 4691 flags.go:64] FLAG: --cluster-dns="[]"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981541 4691 flags.go:64] FLAG: --cluster-domain=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981550 4691 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981560 4691 flags.go:64] FLAG: --config-dir=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981569 4691 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981579 4691 flags.go:64] FLAG: --container-log-max-files="5"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981590 4691 flags.go:64] FLAG: --container-log-max-size="10Mi"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981601 4691 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981610 4691 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981620 4691 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981630 4691 flags.go:64] FLAG: --contention-profiling="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981640 4691 flags.go:64] FLAG: --cpu-cfs-quota="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981650 4691 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981659 4691 flags.go:64] FLAG: --cpu-manager-policy="none"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981670 4691 flags.go:64] FLAG: --cpu-manager-policy-options=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981681 4691 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981691 4691 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981701 4691 flags.go:64] FLAG: --enable-debugging-handlers="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981710 4691 flags.go:64] FLAG: --enable-load-reader="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981719 4691 flags.go:64] FLAG: --enable-server="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981729 4691 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981740 4691 flags.go:64] FLAG: --event-burst="100"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981750 4691 flags.go:64] FLAG: --event-qps="50"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981760 4691 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981770 4691 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981779 4691 flags.go:64] FLAG: --eviction-hard=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981790 4691 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981801 4691 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981810 4691 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981820 4691 flags.go:64] FLAG: --eviction-soft=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981829 4691 flags.go:64] FLAG: --eviction-soft-grace-period=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981839 4691 flags.go:64] FLAG: --exit-on-lock-contention="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981848 4691 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981859 4691 flags.go:64] FLAG: --experimental-mounter-path=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981868 4691 flags.go:64] FLAG: --fail-cgroupv1="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981878 4691 flags.go:64] FLAG: --fail-swap-on="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981913 4691 flags.go:64] FLAG: --feature-gates=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981925 4691 flags.go:64] FLAG: --file-check-frequency="20s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981935 4691 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981945 4691 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981955 4691 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981964 4691 flags.go:64] FLAG: --healthz-port="10248"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981974 4691 flags.go:64] FLAG: --help="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981984 4691 flags.go:64] FLAG: --hostname-override=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.981993 4691 flags.go:64] FLAG: --housekeeping-interval="10s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982003 4691 flags.go:64] FLAG: --http-check-frequency="20s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982012 4691 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982022 4691 flags.go:64] FLAG: --image-credential-provider-config=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982031 4691 flags.go:64] FLAG: --image-gc-high-threshold="85"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982041 4691 flags.go:64] FLAG: --image-gc-low-threshold="80"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982052 4691 flags.go:64] FLAG: --image-service-endpoint=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982061 4691 flags.go:64] FLAG: --kernel-memcg-notification="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982070 4691 flags.go:64] FLAG: --kube-api-burst="100"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982080 4691 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982092 4691 flags.go:64] FLAG: --kube-api-qps="50"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982101 4691 flags.go:64] FLAG: --kube-reserved=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982111 4691 flags.go:64] FLAG: --kube-reserved-cgroup=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982120 4691 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982130 4691 flags.go:64] FLAG: --kubelet-cgroups=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982140 4691 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982150 4691 flags.go:64] FLAG: --lock-file=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982160 4691 flags.go:64] FLAG: --log-cadvisor-usage="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982169 4691 flags.go:64] FLAG: --log-flush-frequency="5s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982179 4691 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982193 4691 flags.go:64] FLAG: --log-json-split-stream="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982202 4691 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982212 4691 flags.go:64] FLAG: --log-text-split-stream="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982222 4691 flags.go:64] FLAG: --logging-format="text"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982231 4691 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982242 4691 flags.go:64] FLAG: --make-iptables-util-chains="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982252 4691 flags.go:64] FLAG: --manifest-url=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982261 4691 flags.go:64] FLAG: --manifest-url-header=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982273 4691 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982283 4691 flags.go:64] FLAG: --max-open-files="1000000"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982294 4691 flags.go:64] FLAG: --max-pods="110"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982304 4691 flags.go:64] FLAG: --maximum-dead-containers="-1"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982314 4691 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982324 4691 flags.go:64] FLAG: --memory-manager-policy="None"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982333 4691 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982343 4691 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982353 4691 flags.go:64] FLAG: --node-ip="192.168.126.11"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982363 4691 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982384 4691 flags.go:64] FLAG: --node-status-max-images="50"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982393 4691 flags.go:64] FLAG: --node-status-update-frequency="10s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982403 4691 flags.go:64] FLAG: --oom-score-adj="-999"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982413 4691 flags.go:64] FLAG: --pod-cidr=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982423 4691 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982437 4691 flags.go:64] FLAG: --pod-manifest-path=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982446 4691 flags.go:64] FLAG: --pod-max-pids="-1"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982456 4691 flags.go:64] FLAG: --pods-per-core="0"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982466 4691 flags.go:64] FLAG: --port="10250"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982476 4691 flags.go:64] FLAG: --protect-kernel-defaults="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982485 4691 flags.go:64] FLAG: --provider-id=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982496 4691 flags.go:64] FLAG: --qos-reserved=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982505 4691 flags.go:64] FLAG: --read-only-port="10255"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982515 4691 flags.go:64] FLAG: --register-node="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982525 4691 flags.go:64] FLAG: --register-schedulable="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982534 4691 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982556 4691 flags.go:64] FLAG: --registry-burst="10"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982565 4691 flags.go:64] FLAG: --registry-qps="5"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982575 4691 flags.go:64] FLAG: --reserved-cpus=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982584 4691 flags.go:64] FLAG: --reserved-memory=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982596 4691 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982605 4691 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982615 4691 flags.go:64] FLAG: --rotate-certificates="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982625 4691 flags.go:64] FLAG: --rotate-server-certificates="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982634 4691 flags.go:64] FLAG: --runonce="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982644 4691 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982653 4691 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982663 4691 flags.go:64] FLAG: --seccomp-default="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982673 4691 flags.go:64] FLAG: --serialize-image-pulls="true"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982683 4691 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982692 4691 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982702 4691 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982712 4691 flags.go:64] FLAG: --storage-driver-password="root"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982721 4691 flags.go:64] FLAG: --storage-driver-secure="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982731 4691 flags.go:64] FLAG: --storage-driver-table="stats"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982741 4691 flags.go:64] FLAG: --storage-driver-user="root"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982750 4691 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982760 4691 flags.go:64] FLAG: --sync-frequency="1m0s"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982770 4691 flags.go:64] FLAG: --system-cgroups=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982779 4691 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982794 4691 flags.go:64] FLAG: --system-reserved-cgroup=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982804 4691 flags.go:64] FLAG: --tls-cert-file=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982813 4691 flags.go:64] FLAG: --tls-cipher-suites="[]"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982825 4691 flags.go:64] FLAG: --tls-min-version=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982835 4691 flags.go:64] FLAG: --tls-private-key-file=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982844 4691 flags.go:64] FLAG: --topology-manager-policy="none"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982854 4691 flags.go:64] FLAG: --topology-manager-policy-options=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982863 4691 flags.go:64] FLAG: --topology-manager-scope="container"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982873 4691 flags.go:64] FLAG: --v="2"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982909 4691 flags.go:64] FLAG: --version="false"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982921 4691 flags.go:64] FLAG: --vmodule=""
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982932 4691 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.982942 4691 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
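The FLAG dump that ends here has a rigid shape (flags.go:64] FLAG: --name="value"), which makes it easy to recover the effective flag set for one boot and diff it against another. A small sketch, assuming journal lines arrive on stdin (for example, piped from journalctl); the regular expression is my own heuristic, not anything kubelet ships:

// Sketch: extract the kubelet FLAG dump from journal lines on stdin
// into a name -> value map, then print it sorted for easy diffing.
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"sort"
)

// Matches lines like: ... flags.go:64] FLAG: --max-pods="110"
var flagRe = regexp.MustCompile(`FLAG: (--[A-Za-z0-9-]+)="(.*)"`)

func main() {
	flags := map[string]string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024) // some journal lines are very long
	for sc.Scan() {
		if m := flagRe.FindStringSubmatch(sc.Text()); m != nil {
			flags[m[1]] = m[2]
		}
	}
	names := make([]string, 0, len(flags))
	for name := range flags {
		names = append(names, name)
	}
	sort.Strings(names)
	for _, name := range names {
		fmt.Printf("%s=%q\n", name, flags[name])
	}
}

Fed this section, it would recover every setting above, including pairs worth a second look: the dump shows --rotate-certificates="false", yet the server later reports client rotation is on, presumably because /etc/kubernetes/kubelet.conf overrides the flag default.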
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983160 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983172 4691 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983183 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983192 4691 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983201 4691 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983212 4691 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983222 4691 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983232 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983241 4691 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983250 4691 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983258 4691 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983267 4691 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983277 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983286 4691 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983294 4691 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983302 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983310 4691 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983319 4691 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983328 4691 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983336 4691 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983345 4691 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983353 4691 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983362 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983376 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983384 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983393 4691 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983401 4691 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983436 4691 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983445 4691 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983454 4691 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983464 4691 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983475 4691 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983484 4691 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983494 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983503 4691 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983514 4691 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983525 4691 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983534 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983544 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983553 4691 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983562 4691 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983570 4691 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983581 4691 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983592 4691 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983601 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983610 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983619 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983628 4691 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983638 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983647 4691 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983656 4691 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983665 4691 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983673 4691 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983681 4691 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983689 4691 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983699 4691 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983707 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983715 4691 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983724 4691 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983734 4691 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983742 4691 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983750 4691 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983758 4691 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983766 4691 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983775 4691 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983783 4691 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983791 4691 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983799 4691 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983813 4691 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983821 4691 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.983829 4691 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.984615 4691 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.993244 4691 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.993266 4691 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993339 4691 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993346 4691 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993351 4691 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993357 4691 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993362 4691 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993367 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993374 4691 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
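The same catalogue of "unrecognized feature gate" warnings repeats above and below because the merged gate map is parsed more than once during startup, and each pass re-logs every unknown name. The mechanics behind the warning are simple: the kubelet only registers Kubernetes gate names, while the OpenShift-supplied configuration carries cluster-level gates (GatewayAPI, NewOLM, and so on), and an unknown name is logged and skipped rather than treated as an error. A minimal hand-rolled sketch of that behavior follows; it is not the actual k8s.io/component-base featuregate code, just an illustration of the pattern:

// Sketch: why "unrecognized feature gate" is a warning, not an error.
// A component knows only its own gates; unknown names arriving from a
// shared cluster config are logged and skipped so that the components
// which do own them can still consume the same settings.
package main

import "fmt"

func main() {
	// Gates this (hypothetical) component understands, with defaults.
	known := map[string]bool{
		"KMSv1":                     false,
		"ValidatingAdmissionPolicy": true,
		"NodeSwap":                  false,
	}
	// Merged cluster-level settings, including gates owned by other
	// components (names taken from the warnings in this log).
	requested := map[string]bool{
		"KMSv1":      true,
		"GatewayAPI": true,
		"NewOLM":     true,
	}
	for name, val := range requested {
		if _, ok := known[name]; !ok {
			fmt.Printf("W unrecognized feature gate: %s\n", name)
			continue
		}
		known[name] = val
	}
	fmt.Printf("feature gates: %v\n", known)
}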
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993381 4691 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993387 4691 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993392 4691 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993398 4691 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993403 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993408 4691 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993412 4691 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993417 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993422 4691 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993427 4691 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993432 4691 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993436 4691 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993441 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993446 4691 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993451 4691 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993456 4691 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993460 4691 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993465 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993470 4691 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993475 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993480 4691 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993485 4691 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993490 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993495 4691 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993503 4691 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993509 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993514 4691 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993520 4691 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993525 4691 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993530 4691 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993534 4691 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993539 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993544 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993548 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993553 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993559 4691 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993566 4691 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993571 4691 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993577 4691 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993583 4691 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993587 4691 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993592 4691 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993597 4691 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993602 4691 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993607 4691 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993612 4691 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993616 4691 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993621 4691 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993626 4691 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993630 4691 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993636 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993641 4691 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993645 4691 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993652 4691 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993658 4691 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993664 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993670 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993675 4691 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993680 4691 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993684 4691 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993689 4691 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993694 4691 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993699 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993704 4691 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 06:19:16 crc kubenswrapper[4691]: I0930 06:19:16.993712 4691 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993842 4691 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993849 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993855 4691 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993861 4691 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993865 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993871 4691 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993875 4691 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993880 4691 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993900 4691 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993905 4691 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993910 4691 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993915 4691 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Sep 30 06:19:16 crc kubenswrapper[4691]: W0930 06:19:16.993919 4691 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993924 4691 feature_gate.go:330] unrecognized feature gate: PinnedImages
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993929 4691 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993933 4691 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993938 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993943 4691 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993949 4691 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993957 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993963 4691 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993968 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993973 4691 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993978 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993985 4691 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993990 4691 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.993995 4691 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994000 4691 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994004 4691 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994009 4691 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994014 4691 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994018 4691 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994025 4691 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994031 4691 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994037 4691 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994042 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994047 4691 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994052 4691 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994057 4691 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994062 4691 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994066 4691 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994071 4691 feature_gate.go:330] unrecognized feature gate: SignatureStores
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994076 4691 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994081 4691 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994085 4691 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994090 4691 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994095 4691 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994101 4691 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994106 4691 feature_gate.go:330] unrecognized feature gate: Example
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994111 4691 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994116 4691 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994121 4691 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994127 4691 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994132 4691 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994137 4691 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994141 4691 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994146 4691 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994151 4691 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994158 4691 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994164 4691 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994169 4691 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994174 4691 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994179 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994183 4691 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994188 4691 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994193 4691 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994198 4691 feature_gate.go:330] unrecognized feature gate: NewOLM
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994203 4691 feature_gate.go:330] unrecognized feature gate: OVNObservability
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994208 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994213 4691 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:16.994219 4691 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:16.994227 4691 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
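By this point the same gate summary has been printed three times, once per parse of the configuration, and each pass re-emitted the full warning catalogue. When skimming a boot like this one, collapsing the repeats makes the remaining signal much easier to see. A small sketch that counts identical messages after stripping the per-line timestamps; the splitting heuristic (everything after the first "] ", which drops the klog header) is mine, not anything journalctl or kubelet provides:

// Sketch: collapse repeated journal messages (e.g. the thrice-repeated
// feature-gate warnings in this boot) into "count x message" lines.
package main

import (
	"bufio"
	"fmt"
	"os"
	"strings"
)

func main() {
	counts := map[string]int{}
	order := []string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 1024*1024), 1024*1024)
	for sc.Scan() {
		msg := sc.Text()
		// Heuristic: drop everything up to the klog "file.go:NNN] " marker,
		// removing the timestamps that would otherwise defeat deduplication.
		if i := strings.Index(msg, "] "); i >= 0 {
			msg = msg[i+2:]
		}
		if counts[msg] == 0 {
			order = append(order, msg)
		}
		counts[msg]++
	}
	for _, msg := range order {
		fmt.Printf("%4dx %s\n", counts[msg], msg)
	}
}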
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:16.995097 4691 server.go:940] "Client rotation is on, will bootstrap in background" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:16.998688 4691 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:16.998777 4691 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.000772 4691 server.go:997] "Starting client certificate rotation" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.000797 4691 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.001041 4691 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-15 13:15:15.854241134 +0000 UTC Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.003875 4691 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 1110h55m58.850375216s for next certificate rotation Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.028102 4691 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.030936 4691 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.045406 4691 log.go:25] "Validated CRI v1 runtime API" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.081285 4691 log.go:25] "Validated CRI v1 image API" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.083204 4691 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.089597 4691 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-09-30-06-07-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.089653 4691 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.116162 4691 manager.go:217] Machine: {Timestamp:2025-09-30 06:19:17.11362275 +0000 UTC m=+0.588643870 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec 
SystemUUID:d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7 BootID:5d33cbaa-1b8a-4dde-af56-05c3aae2213e Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:f4:73:d9 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:f4:73:d9 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:17:ee:1e Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:16:e1:9d Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:51:dd:60 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:ba:3e:22 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e2:c5:f7:ce:07:bc Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:0e:2e:32:a6:64:f5 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} 
{Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.116523 4691 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.116688 4691 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.119249 4691 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.119584 4691 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.119647 4691 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.119999 4691 topology_manager.go:138] "Creating topology manager with none policy" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.120017 4691 container_manager_linux.go:303] "Creating device plugin manager" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.120830 4691 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.120863 4691 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.121312 4691 state_mem.go:36] "Initialized new in-memory state store" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.121831 4691 server.go:1245] "Using root directory" path="/var/lib/kubelet" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.126089 4691 kubelet.go:418] "Attempting to sync node with API server" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.126123 4691 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.126167 4691 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.126190 4691 kubelet.go:324] "Adding apiserver pod source" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.126208 4691 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:17.134810 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:17.134844 
4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.135070 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.135052 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.137578 4691 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.138943 4691 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.140590 4691 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143734 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143778 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143794 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143809 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143831 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143844 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143857 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143878 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143922 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143936 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143954 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.143967 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.146497 4691 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.147241 4691 
server.go:1280] "Started kubelet" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.147470 4691 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.147638 4691 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.147702 4691 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.148432 4691 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Sep 30 06:19:17 crc systemd[1]: Started Kubernetes Kubelet. Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.150297 4691 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.150334 4691 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.150709 4691 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 14:57:00.841839294 +0000 UTC Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.150766 4691 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 968h37m43.691079343s for next certificate rotation Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.151045 4691 volume_manager.go:287] "The desired_state_of_world populator starts" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.151064 4691 volume_manager.go:289] "Starting Kubelet Volume Manager" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.151181 4691 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.151179 4691 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:17.152128 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.152241 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.153347 4691 server.go:460] "Adding debug handlers to kubelet server" Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.156355 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="200ms" Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.155049 4691 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.46:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.1869fb0bb3f3c922 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-09-30 06:19:17.147199778 +0000 UTC m=+0.622220858,LastTimestamp:2025-09-30 06:19:17.147199778 +0000 UTC m=+0.622220858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.159486 4691 factory.go:55] Registering systemd factory Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.159546 4691 factory.go:221] Registration of the systemd container factory successfully Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.163715 4691 factory.go:153] Registering CRI-O factory Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.163772 4691 factory.go:221] Registration of the crio container factory successfully Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.163945 4691 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.163981 4691 factory.go:103] Registering Raw factory Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.164008 4691 manager.go:1196] Started watching for new ooms in manager Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.165678 4691 manager.go:319] Starting recovery of all containers Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170295 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170362 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170394 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170415 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170476 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170502 4691 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170523 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170549 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170578 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170603 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170626 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170646 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170672 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170702 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170722 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170742 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170771 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170792 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170816 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170838 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170858 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170911 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170931 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170951 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170974 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.170994 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171035 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171167 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171257 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171289 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171313 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171386 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171411 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171446 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171468 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171492 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171524 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171550 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171578 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171600 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171623 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171650 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171671 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171698 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171721 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171744 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171876 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171938 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.171978 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.172011 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.172061 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.172092 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.172130 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.172155 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.172187 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.172217 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.172239 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.172266 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173059 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173102 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173126 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173154 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173180 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173233 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173260 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173285 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173310 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173333 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173356 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.173383 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.175837 4691 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.175932 4691 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.175967 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.175997 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176026 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176053 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176080 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176107 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176132 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176181 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176207 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176234 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176256 4691 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176275 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176294 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176311 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176330 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176348 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176368 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176391 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176412 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176435 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176462 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176493 4691 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176521 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176547 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176573 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176598 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176623 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176648 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176675 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176703 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176730 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176756 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176837 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176881 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176958 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.176992 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177022 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177051 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177079 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177108 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177137 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177249 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177276 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177304 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177333 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177360 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177386 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177412 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177442 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177473 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177536 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177563 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177590 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177614 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177641 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177666 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177692 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177719 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177743 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177769 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177795 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177821 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177849 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177876 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177937 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177969 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.177995 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178018 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178043 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178070 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178094 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178120 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178146 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178171 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178197 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178223 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178249 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178277 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178303 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178330 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178368 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178399 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178424 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178449 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178473 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178499 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178524 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178549 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178573 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178604 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178628 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178655 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178681 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178708 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178733 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178759 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178788 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178816 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178840 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178862 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178943 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.178978 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179003 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179028 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179053 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179076 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179100 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179128 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179152 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179175 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179201 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179224 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179247 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179276 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179303 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179326 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179390 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179416 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179453 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179476 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179499 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179528 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179551 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179578 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179602 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179626 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179650 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179674 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179698 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179723 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179745 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179767 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179789 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179812 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179834 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179858 4691 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext=""
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179879 4691 reconstruct.go:97] "Volume reconstruction finished"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.179943 4691 reconciler.go:26] "Reconciler: start to sync state"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.202517 4691 manager.go:324] Recovery completed
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.217641 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.219060 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.219110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.219124 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.220050 4691 cpu_manager.go:225] "Starting CPU manager" policy="none"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.220256 4691 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.220490 4691 state_mem.go:36] "Initialized new in-memory state store"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.221594 4691 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.223451 4691 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.223511 4691 status_manager.go:217] "Starting to sync pod status with apiserver"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.223543 4691 kubelet.go:2335] "Starting kubelet main sync loop"
Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.224087 4691 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:17.224385 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused
Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.224475 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.239647 4691 policy_none.go:49] "None policy: Start"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.240656 4691 memory_manager.go:170] "Starting memorymanager" policy="None"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.240693 4691 state_mem.go:35] "Initializing new in-memory state store"
Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.251258 4691 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.300961 4691 manager.go:334] "Starting Device Plugin manager"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.301111 4691 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.301127 4691 server.go:79] "Starting device plugin registration server"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.301584 4691 eviction_manager.go:189] "Eviction manager: starting control loop"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.301607 4691 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.302186 4691 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.302281 4691 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.302296 4691 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.308754 4691 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.325007 4691 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.325080 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.326464 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.326497 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.326510 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.326644 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.326786 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.326840 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.327941 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.327967 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.327985 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.328090 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.328250 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.328291 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.328788 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.328817 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.328829 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.328818 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.328927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.328947 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329054 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329191 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329216 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329227 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329199 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329274 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329674 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329711 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329724 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329877 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.329981 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.330028 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.330200 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.330232 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.330245 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.332522 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.332561 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.332572 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.332844 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.332953 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.332842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.333021 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.333041 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.334514 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.334555 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.334568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.357319 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="400ms"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383115 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383158 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383181 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383200 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383222 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383242 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383329 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383374 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383395 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383413 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383452 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383473 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383495 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383517 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.383542 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.402577 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.403918 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.403965 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.403979 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.404014 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.404532 4691 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.484976 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485017 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485041 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485062 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485083 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485103 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485122 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485140 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485161 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485179 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485197 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485222 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485220 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485199 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485274 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485361 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485382 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485407 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485274 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485438 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485467 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485413 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485303 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485248 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485454 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485394 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485653 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485742 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485759 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.485861 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.604692 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.606641 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.606692 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.606706 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.606737 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.607300 4691 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.670320 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.690057 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.699319 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:17.726043 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-9686fe5cfb2c8afbe2f799e7d627fb52860169d6bdbbb2d387c3af7e4d4edcac WatchSource:0}: Error finding container 9686fe5cfb2c8afbe2f799e7d627fb52860169d6bdbbb2d387c3af7e4d4edcac: Status 404 returned error can't find the container with id 9686fe5cfb2c8afbe2f799e7d627fb52860169d6bdbbb2d387c3af7e4d4edcac
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:17.728014 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-f5b9c59a455c765f9fd4ea10ba4d7aadbf26fd8edb806e70152de79a23b1462e WatchSource:0}: Error finding container f5b9c59a455c765f9fd4ea10ba4d7aadbf26fd8edb806e70152de79a23b1462e: Status 404 returned error can't find the container with id f5b9c59a455c765f9fd4ea10ba4d7aadbf26fd8edb806e70152de79a23b1462e
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.731630 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:17.733111 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-1af034e6f884986d9e7dd61fe2609ec0394f75b02c0c4320e8f5f93593de231a WatchSource:0}: Error finding container 1af034e6f884986d9e7dd61fe2609ec0394f75b02c0c4320e8f5f93593de231a: Status 404 returned error can't find the container with id 1af034e6f884986d9e7dd61fe2609ec0394f75b02c0c4320e8f5f93593de231a
Sep 30 06:19:17 crc kubenswrapper[4691]: I0930 06:19:17.735168 4691 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 06:19:17 crc kubenswrapper[4691]: E0930 06:19:17.759299 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="800ms" Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:17.761470 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-93140041d6e0d1b646de766df3c747581931188fc821376a1d385cd36ce73947 WatchSource:0}: Error finding container 93140041d6e0d1b646de766df3c747581931188fc821376a1d385cd36ce73947: Status 404 returned error can't find the container with id 93140041d6e0d1b646de766df3c747581931188fc821376a1d385cd36ce73947 Sep 30 06:19:17 crc kubenswrapper[4691]: W0930 06:19:17.766628 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-25bb68f45ae16b89cbb799ffa23069915c0faf307fb5265d2a61be07a574d4b7 WatchSource:0}: Error finding container 25bb68f45ae16b89cbb799ffa23069915c0faf307fb5265d2a61be07a574d4b7: Status 404 returned error can't find the container with id 25bb68f45ae16b89cbb799ffa23069915c0faf307fb5265d2a61be07a574d4b7 Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.007954 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.009656 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.009729 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.009752 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.009800 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 06:19:18 crc kubenswrapper[4691]: E0930 06:19:18.010374 4691 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.148586 4691 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:18 crc kubenswrapper[4691]: W0930 06:19:18.191361 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:18 crc kubenswrapper[4691]: E0930 06:19:18.191502 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 
38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Sep 30 06:19:18 crc kubenswrapper[4691]: W0930 06:19:18.196231 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:18 crc kubenswrapper[4691]: E0930 06:19:18.196336 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.227806 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"93140041d6e0d1b646de766df3c747581931188fc821376a1d385cd36ce73947"} Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.229998 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"1af034e6f884986d9e7dd61fe2609ec0394f75b02c0c4320e8f5f93593de231a"} Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.231493 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f5b9c59a455c765f9fd4ea10ba4d7aadbf26fd8edb806e70152de79a23b1462e"} Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.233006 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"9686fe5cfb2c8afbe2f799e7d627fb52860169d6bdbbb2d387c3af7e4d4edcac"} Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.234318 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"25bb68f45ae16b89cbb799ffa23069915c0faf307fb5265d2a61be07a574d4b7"} Sep 30 06:19:18 crc kubenswrapper[4691]: W0930 06:19:18.237250 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:18 crc kubenswrapper[4691]: E0930 06:19:18.237348 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Sep 30 06:19:18 crc kubenswrapper[4691]: E0930 06:19:18.560482 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="1.6s" Sep 30 06:19:18 crc kubenswrapper[4691]: W0930 06:19:18.676823 4691 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:18 crc kubenswrapper[4691]: E0930 06:19:18.676986 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.810928 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.812831 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.812922 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.812947 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:18 crc kubenswrapper[4691]: I0930 06:19:18.812994 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 06:19:18 crc kubenswrapper[4691]: E0930 06:19:18.813498 4691 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.148965 4691 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.240253 4691 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53" exitCode=0 Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.240355 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53"} Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.240481 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.241980 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.242024 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.242042 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.242831 4691 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82" exitCode=0 Sep 30 06:19:19 crc 
kubenswrapper[4691]: I0930 06:19:19.242878 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82"} Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.243050 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.243957 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.244411 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.244473 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.244494 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.245143 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.245186 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.245203 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.245991 4691 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="fff6d766366a54e73752a41e180f1e850b8e9c41a8189b7f1df5f82d28e2566e" exitCode=0 Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.246113 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.246087 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"fff6d766366a54e73752a41e180f1e850b8e9c41a8189b7f1df5f82d28e2566e"} Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.247856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.247944 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.247973 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.248948 4691 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e" exitCode=0 Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.249063 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e"} Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.254255 4691 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.256388 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8"} Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.256535 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a"} Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.256567 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370"} Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.256594 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8"} Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.256595 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.257049 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.257118 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.257143 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.257938 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.257983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:19 crc kubenswrapper[4691]: I0930 06:19:19.258000 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.149117 4691 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:20 crc kubenswrapper[4691]: E0930 06:19:20.161719 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.46:6443: connect: connection refused" interval="3.2s" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.263401 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8"} Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.263445 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2"} Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.263455 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91"} Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.263467 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637"} Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.265505 4691 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a" exitCode=0 Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.265582 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a"} Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.265665 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.266903 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3410624e48656bd289ed6d53742ae7dd5e0ba5148c42d9964a540ae97bd0b8d5"} Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.266952 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.267115 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.267171 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.267183 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.267814 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.267862 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.267905 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.272575 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2"} Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.272631 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7"} Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.272653 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b"} Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.272593 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.272777 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.273803 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.273874 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.273927 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.275781 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.275849 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.275868 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.413594 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.415374 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.415421 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.415434 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:20 crc kubenswrapper[4691]: I0930 06:19:20.415464 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 06:19:20 crc kubenswrapper[4691]: E0930 06:19:20.416005 4691 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.46:6443: connect: connection refused" node="crc" Sep 30 06:19:20 crc kubenswrapper[4691]: W0930 06:19:20.516122 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: 
connection refused Sep 30 06:19:20 crc kubenswrapper[4691]: E0930 06:19:20.516257 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Sep 30 06:19:20 crc kubenswrapper[4691]: W0930 06:19:20.852663 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.46:6443: connect: connection refused Sep 30 06:19:20 crc kubenswrapper[4691]: E0930 06:19:20.852788 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.46:6443: connect: connection refused" logger="UnhandledError" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.278622 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698"} Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.278749 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.280236 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.280285 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.280305 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.282254 4691 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7" exitCode=0 Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.282305 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7"} Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.282383 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.282409 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.282405 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.282600 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.283501 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.283523 
4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.283531 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.283616 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.283642 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.283656 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.283940 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.283991 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:21 crc kubenswrapper[4691]: I0930 06:19:21.284004 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.101712 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.289584 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.289656 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.289812 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe"} Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.289860 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09"} Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.289882 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a"} Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.290847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.290926 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.290946 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.731349 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.731670 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.734259 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.734338 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:22 crc kubenswrapper[4691]: I0930 06:19:22.734356 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.299267 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.299340 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.300129 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.300457 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351"} Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.300522 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1"} Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.301003 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.301061 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.301079 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.301179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.301212 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.301229 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.358478 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.358678 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.360197 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.360245 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.360262 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:23 crc kubenswrapper[4691]: 
I0930 06:19:23.616598 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.618414 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.618461 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.618479 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:23 crc kubenswrapper[4691]: I0930 06:19:23.618515 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.302513 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.303733 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.303950 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.304110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.518444 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.518633 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.518703 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.520228 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.520286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.520308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.891950 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.892271 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.894403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.894480 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.894500 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:24 crc kubenswrapper[4691]: I0930 06:19:24.901871 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 06:19:25 crc kubenswrapper[4691]: I0930 06:19:25.305614 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:25 crc kubenswrapper[4691]: I0930 06:19:25.306939 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:25 crc kubenswrapper[4691]: I0930 06:19:25.306988 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:25 crc kubenswrapper[4691]: I0930 06:19:25.307005 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:26 crc kubenswrapper[4691]: I0930 06:19:26.194082 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Sep 30 06:19:26 crc kubenswrapper[4691]: I0930 06:19:26.194335 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:26 crc kubenswrapper[4691]: I0930 06:19:26.195788 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:26 crc kubenswrapper[4691]: I0930 06:19:26.195839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:26 crc kubenswrapper[4691]: I0930 06:19:26.195857 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:26 crc kubenswrapper[4691]: I0930 06:19:26.398376 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Sep 30 06:19:26 crc kubenswrapper[4691]: I0930 06:19:26.398690 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:26 crc kubenswrapper[4691]: I0930 06:19:26.400258 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:26 crc kubenswrapper[4691]: I0930 06:19:26.400305 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:26 crc kubenswrapper[4691]: I0930 06:19:26.400323 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:27 crc kubenswrapper[4691]: E0930 06:19:27.308941 4691 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Sep 30 06:19:27 crc kubenswrapper[4691]: I0930 06:19:27.466697 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 06:19:27 crc kubenswrapper[4691]: I0930 06:19:27.467003 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:27 crc kubenswrapper[4691]: I0930 06:19:27.468487 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:27 crc kubenswrapper[4691]: I0930 06:19:27.468546 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:27 crc kubenswrapper[4691]: I0930 06:19:27.468563 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 
06:19:28.191305 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 06:19:28.191595 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 06:19:28.193389 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 06:19:28.193454 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 06:19:28.193472 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 06:19:28.197952 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 06:19:28.243321 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 06:19:28.313657 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 06:19:28.315343 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 06:19:28.315425 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:28 crc kubenswrapper[4691]: I0930 06:19:28.315445 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:29 crc kubenswrapper[4691]: I0930 06:19:29.316453 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:29 crc kubenswrapper[4691]: I0930 06:19:29.318972 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:29 crc kubenswrapper[4691]: I0930 06:19:29.319055 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:29 crc kubenswrapper[4691]: I0930 06:19:29.319082 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:30 crc kubenswrapper[4691]: W0930 06:19:30.934790 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 06:19:30 crc kubenswrapper[4691]: I0930 06:19:30.935006 4691 trace.go:236] Trace[814658366]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 06:19:20.933) (total time: 10001ms): Sep 30 06:19:30 crc kubenswrapper[4691]: Trace[814658366]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (06:19:30.934) Sep 30 06:19:30 crc kubenswrapper[4691]: Trace[814658366]: [10.001070119s] [10.001070119s] END Sep 30 06:19:30 crc kubenswrapper[4691]: E0930 06:19:30.935054 4691 
reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 30 06:19:31 crc kubenswrapper[4691]: W0930 06:19:31.082540 4691 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Sep 30 06:19:31 crc kubenswrapper[4691]: I0930 06:19:31.082668 4691 trace.go:236] Trace[641022716]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 06:19:21.080) (total time: 10002ms): Sep 30 06:19:31 crc kubenswrapper[4691]: Trace[641022716]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:19:31.082) Sep 30 06:19:31 crc kubenswrapper[4691]: Trace[641022716]: [10.002078432s] [10.002078432s] END Sep 30 06:19:31 crc kubenswrapper[4691]: E0930 06:19:31.082699 4691 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Sep 30 06:19:31 crc kubenswrapper[4691]: I0930 06:19:31.148847 4691 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Sep 30 06:19:31 crc kubenswrapper[4691]: I0930 06:19:31.191410 4691 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Sep 30 06:19:31 crc kubenswrapper[4691]: I0930 06:19:31.191576 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.104814 4691 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.104901 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 06:19:32 crc 
kubenswrapper[4691]: I0930 06:19:32.113319 4691 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.113396 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.324157 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.325973 4691 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698" exitCode=255 Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.326007 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698"} Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.326163 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.327027 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.327074 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.327093 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:32 crc kubenswrapper[4691]: I0930 06:19:32.327959 4691 scope.go:117] "RemoveContainer" containerID="017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698" Sep 30 06:19:33 crc kubenswrapper[4691]: I0930 06:19:33.330910 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Sep 30 06:19:33 crc kubenswrapper[4691]: I0930 06:19:33.332600 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca"} Sep 30 06:19:33 crc kubenswrapper[4691]: I0930 06:19:33.332743 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:33 crc kubenswrapper[4691]: I0930 06:19:33.333435 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:33 crc kubenswrapper[4691]: I0930 06:19:33.333493 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:33 crc kubenswrapper[4691]: I0930 
06:19:33.333512 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:34 crc kubenswrapper[4691]: I0930 06:19:34.526382 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 06:19:34 crc kubenswrapper[4691]: I0930 06:19:34.526580 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:34 crc kubenswrapper[4691]: I0930 06:19:34.526681 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 06:19:34 crc kubenswrapper[4691]: I0930 06:19:34.528002 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:34 crc kubenswrapper[4691]: I0930 06:19:34.528054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:34 crc kubenswrapper[4691]: I0930 06:19:34.528071 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:34 crc kubenswrapper[4691]: I0930 06:19:34.534211 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 06:19:34 crc kubenswrapper[4691]: I0930 06:19:34.691493 4691 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Sep 30 06:19:35 crc kubenswrapper[4691]: I0930 06:19:35.338574 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:35 crc kubenswrapper[4691]: I0930 06:19:35.340033 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:35 crc kubenswrapper[4691]: I0930 06:19:35.340117 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:35 crc kubenswrapper[4691]: I0930 06:19:35.340136 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:36 crc kubenswrapper[4691]: I0930 06:19:36.240958 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Sep 30 06:19:36 crc kubenswrapper[4691]: I0930 06:19:36.241206 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:36 crc kubenswrapper[4691]: I0930 06:19:36.242634 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:36 crc kubenswrapper[4691]: I0930 06:19:36.242679 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:36 crc kubenswrapper[4691]: I0930 06:19:36.242697 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:36 crc kubenswrapper[4691]: I0930 06:19:36.260946 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Sep 30 06:19:36 crc kubenswrapper[4691]: I0930 06:19:36.336344 4691 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.105489 4691 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.108286 4691 trace.go:236] Trace[1673301405]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 06:19:25.280) (total time: 11827ms): Sep 30 06:19:37 crc kubenswrapper[4691]: Trace[1673301405]: ---"Objects listed" error: 11827ms (06:19:37.108) Sep 30 06:19:37 crc kubenswrapper[4691]: Trace[1673301405]: [11.827883996s] [11.827883996s] END Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.108318 4691 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.109635 4691 trace.go:236] Trace[842055630]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (30-Sep-2025 06:19:25.164) (total time: 11944ms): Sep 30 06:19:37 crc kubenswrapper[4691]: Trace[842055630]: ---"Objects listed" error: 11944ms (06:19:37.109) Sep 30 06:19:37 crc kubenswrapper[4691]: Trace[842055630]: [11.944637696s] [11.944637696s] END Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.109661 4691 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.109822 4691 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.110501 4691 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.137567 4691 apiserver.go:52] "Watching apiserver" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.141769 4691 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.142485 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"] Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.143485 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.143856 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.143947 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.144030 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.144066 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.144131 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.144507 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.144835 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.144905 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.149547 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.149723 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.149806 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.149951 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.150112 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.150196 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.150231 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.151658 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.151790 4691 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-network-node-identity"/"network-node-identity-cert" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.152034 4691 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.196935 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.208900 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.210779 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.210836 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.210877 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: 
\"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.210930 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.210992 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211028 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211060 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211091 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211123 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211152 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211181 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211211 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211244 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211276 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211307 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211339 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211369 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211459 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211521 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211555 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211585 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211641 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.211670 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212096 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212178 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212190 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212216 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212360 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212403 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212436 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212467 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212501 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 
06:19:37.212532 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212568 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212598 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212629 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212670 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212702 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212763 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212794 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212869 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212924 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 06:19:37 crc 
kubenswrapper[4691]: I0930 06:19:37.212958 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212988 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213021 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213054 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213086 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213118 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213150 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213184 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213216 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213248 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213279 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213313 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212211 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213345 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213379 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213412 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212210 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212220 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212423 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212417 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212410 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212834 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.212875 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213015 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213053 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213056 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213120 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213225 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213277 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213313 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213326 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213308 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.213425 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:19:37.713409282 +0000 UTC m=+21.188430322 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.215707 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213443 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213552 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213642 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213664 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213717 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213751 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213845 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213851 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.213971 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214169 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214163 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214206 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214207 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214380 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). 
InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214420 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214461 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214497 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214657 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214710 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.214821 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.215226 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.215395 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216298 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.215719 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216362 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216387 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216413 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216474 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216498 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216508 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod 
"c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216522 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216591 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216631 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216666 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216662 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216701 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216740 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216778 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216817 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216821 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216852 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216922 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216959 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216867 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216992 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.216998 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217023 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217024 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217060 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217093 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217124 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217158 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217193 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217213 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217225 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217258 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217294 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217330 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217365 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217397 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217434 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217471 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217520 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: 
\"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217570 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217619 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217668 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217713 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217749 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217785 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217818 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217853 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217917 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217949 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217968 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.217988 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218004 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218025 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218124 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218162 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218197 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218232 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218266 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod 
\"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218306 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218340 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218374 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218409 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218444 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218479 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218514 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218553 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218586 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218623 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218657 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218690 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218724 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218763 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218799 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218836 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218882 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220579 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220622 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220657 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218000 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218236 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218385 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218416 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218586 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218572 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218704 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.221347 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.218872 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.219248 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.219296 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.219963 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.219975 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220120 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220192 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). 
InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220396 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220507 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220556 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220568 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220709 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220701 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220848 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.220952 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.221202 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.221209 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.221287 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.221853 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222403 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222467 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222506 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222547 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222622 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222667 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222702 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222738 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222770 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222803 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 06:19:37 crc 
kubenswrapper[4691]: I0930 06:19:37.222839 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222875 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222987 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223024 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223059 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223095 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223145 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223191 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223353 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223459 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223504 4691 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223544 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223588 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223626 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223669 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223711 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223754 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223796 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223833 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223873 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223940 4691 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223978 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224019 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224056 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224103 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224145 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224188 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224230 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224266 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224308 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Sep 30 
06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224350 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224389 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224431 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224468 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224511 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224553 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224591 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224635 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224676 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224713 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " 
Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224752 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224795 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224839 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224882 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224994 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225031 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225106 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225152 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225699 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226212 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226281 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226689 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226776 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226821 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226867 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226973 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.227016 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.227079 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.227119 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.227158 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.227227 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222412 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222473 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.222997 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223304 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223562 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223850 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.223842 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224029 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224276 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224353 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224600 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224635 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.224986 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225176 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225426 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225600 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225677 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225738 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.227484 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225946 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.225966 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226004 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226702 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226715 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226769 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226855 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.226980 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.227140 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.227268 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.227414 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.227793 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.228038 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.228333 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.228363 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.228686 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.228822 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.228995 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.229049 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.229076 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.231141 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.231488 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.231671 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.231772 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.231867 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.232032 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.232193 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.232303 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.232654 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.232724 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.232772 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.233004 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.233030 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.233267 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.233299 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.233070 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.233529 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.233586 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.233718 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.233811 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.233832 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.234125 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.234262 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.234451 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). 
InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.234513 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.234573 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.234724 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.234806 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:37.734768473 +0000 UTC m=+21.209789513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.235235 4691 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.235437 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:37.735419173 +0000 UTC m=+21.210440213 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.237000 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.237980 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.238047 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.238938 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.239003 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.239068 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.241848 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.243595 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.243625 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.243904 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.244508 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.244703 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.245062 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.245540 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.246741 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.246839 4691 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.246864 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247029 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247062 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247088 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247114 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247279 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247329 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247348 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247365 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247383 4691 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247401 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247463 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247486 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247531 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247631 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247650 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247666 4691 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247684 4691 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.248034 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.248412 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247774 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.247968 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.248466 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249206 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:1
9:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.248056 4691 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249648 4691 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249701 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249722 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" 
DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249740 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249756 4691 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249772 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249788 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249804 4691 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249820 4691 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249835 4691 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249851 4691 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249867 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249905 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249924 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249939 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249955 4691 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: 
I0930 06:19:37.249971 4691 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.249987 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250002 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250018 4691 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250050 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250065 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250080 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250095 4691 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250127 4691 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250142 4691 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250158 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250200 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250218 4691 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250234 4691 reconciler_common.go:293] 
"Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250250 4691 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250266 4691 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250282 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250298 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250314 4691 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250330 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250346 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250362 4691 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250377 4691 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250393 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250410 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250426 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250445 4691 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250470 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250485 4691 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250506 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250522 4691 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250538 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250554 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250588 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250605 4691 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250621 4691 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250637 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250654 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250670 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" 
DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250686 4691 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250704 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250727 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250744 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250760 4691 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250775 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250791 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250806 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250830 4691 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250845 4691 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250860 4691 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250883 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250926 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc 
kubenswrapper[4691]: I0930 06:19:37.250965 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.250981 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251029 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251048 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251074 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251090 4691 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251105 4691 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251121 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251145 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251164 4691 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251179 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251194 4691 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251216 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath 
\"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251232 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251248 4691 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251263 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251278 4691 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251294 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251310 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251326 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251346 4691 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251369 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251384 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251400 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251421 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251443 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: 
\"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251460 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251475 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251491 4691 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251506 4691 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251521 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251547 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251563 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251579 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251601 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251617 4691 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251639 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251655 4691 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.251669 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.252321 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.252599 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.254020 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.261495 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.261783 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.261877 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.262561 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.265090 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.265129 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.265153 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.262608 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.265383 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.265392 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:37.765371224 +0000 UTC m=+21.240392254 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.265415 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.265431 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.265486 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:37.765470587 +0000 UTC m=+21.240491627 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.266914 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.267187 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.267431 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.267574 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.267596 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.268010 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.268658 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.271127 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.271141 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.271344 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.271388 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.271626 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.273148 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.280666 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.282859 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.283003 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.283636 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.283696 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.284352 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.284936 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.285967 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.286134 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.287216 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.287601 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.289013 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.289066 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.290811 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.291470 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.292089 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.292351 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.292506 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.292846 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.293313 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.293325 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.293517 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.294652 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.295090 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.297361 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.297377 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.297382 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.297966 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.298575 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.299278 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.299869 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.299960 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.300110 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.300455 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.300450 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.300631 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.301378 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.303053 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.304787 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Sep 30 06:19:37 crc 
kubenswrapper[4691]: I0930 06:19:37.305389 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.306027 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.307867 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.312442 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.313468 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.315221 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.315437 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.315635 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.316221 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.317632 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.318157 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.318416 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.318870 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.320431 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.321515 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.322045 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.322742 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.323836 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.324346 4691 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.324443 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.324909 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.326946 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.327520 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.327960 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.329849 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.330896 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.331512 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.332325 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.332680 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" 
(UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.333732 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.334337 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.335597 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.336254 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.337297 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.338160 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.338373 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.340442 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.341338 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.342125 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.343220 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.344278 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.345097 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.345586 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.346126 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.346978 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.348420 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.351919 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.351947 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352029 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352064 4691 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352150 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352465 4691 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352474 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352484 4691 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352494 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352503 4691 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352511 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352520 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352527 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352535 4691 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352542 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352551 4691 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352559 4691 reconciler_common.go:293] "Volume detached 
for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352568 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352576 4691 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352585 4691 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352593 4691 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352614 4691 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352622 4691 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352630 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352637 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352648 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352657 4691 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352664 4691 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352672 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352705 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352715 4691 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352796 4691 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352811 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352820 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352829 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352837 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352845 4691 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352853 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352967 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352978 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352986 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.352994 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353002 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353010 4691 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353018 4691 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353089 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353097 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353105 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353113 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353121 4691 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353128 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353136 4691 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353144 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353152 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353160 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353169 4691 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353178 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353186 4691 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353195 4691 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353204 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353212 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353220 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353230 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353238 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353247 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353256 4691 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353264 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353273 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc 
kubenswrapper[4691]: I0930 06:19:37.353281 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.353291 4691 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.368234 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646
fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.379254 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.392952 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.400909 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.415691 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.425951 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.435418 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.449100 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.460377 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.471082 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.480013 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.486924 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.493171 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib
/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\
":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.517331 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.529505 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.541137 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.553563 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:37 crc kubenswrapper[4691]: W0930 06:19:37.564220 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-4377dc2487a0cc7fbdc42cd86902fc52848f3be13976b279bb0e15d0101cb1a7 WatchSource:0}: Error finding container 4377dc2487a0cc7fbdc42cd86902fc52848f3be13976b279bb0e15d0101cb1a7: Status 404 returned error can't find the container with id 4377dc2487a0cc7fbdc42cd86902fc52848f3be13976b279bb0e15d0101cb1a7 Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.755743 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.755810 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.755848 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.755952 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.755993 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:38.755979674 +0000 UTC m=+22.231000714 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.756035 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:19:38.756029636 +0000 UTC m=+22.231050676 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.756062 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.756081 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:38.756075777 +0000 UTC m=+22.231096807 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.856685 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:37 crc kubenswrapper[4691]: I0930 06:19:37.856753 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.856920 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.856942 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.856950 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.856996 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.857008 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.857077 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:38.857044689 +0000 UTC m=+22.332065729 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.856959 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:37 crc kubenswrapper[4691]: E0930 06:19:37.857116 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:38.857110181 +0000 UTC m=+22.332131221 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.195168 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.199351 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.203037 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.206738 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.214770 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.223702 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.223807 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.224601 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.243418 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe6
5f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.259377 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.275388 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.289152 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.302648 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.321205 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.338254 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z"
Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.347533 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76"}
Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.347572 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a"}
Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.347597 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4377dc2487a0cc7fbdc42cd86902fc52848f3be13976b279bb0e15d0101cb1a7"}
Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.348674 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"84f5cd0964516f43e93c6425614e83e370013f3308299045c810f36baebdaafc"}
Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.349924 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328"}
Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.349956 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"70bedd433304954dfbcfa38a2547947d18f763ed58c5db3b29b54ef67cc141bf"}
Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.362923 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.395326 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be
43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.422597 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.476621 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.518948 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.541490 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.573168 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.592846 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.606870 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.616700 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.632328 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-8htrc"] Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.632600 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8htrc" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.634417 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.636221 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.636411 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.636723 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.663931 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"r
estartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\
\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.675133 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.684821 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.693776 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.702831 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.711978 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.724015 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.734590 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.746479 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.758815 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.761906 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.762002 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:19:40.761979923 +0000 UTC m=+24.237000963 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.762067 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.762096 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7c1c8663-d263-4b8c-93fa-05ee1b61d7f1-hosts-file\") pod \"node-resolver-8htrc\" (UID: \"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\") " pod="openshift-dns/node-resolver-8htrc" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.762112 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj86t\" (UniqueName: \"kubernetes.io/projected/7c1c8663-d263-4b8c-93fa-05ee1b61d7f1-kube-api-access-dj86t\") pod \"node-resolver-8htrc\" (UID: \"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\") " pod="openshift-dns/node-resolver-8htrc" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.762141 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.762203 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.762256 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:40.762247011 +0000 UTC m=+24.237268041 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.762206 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.762325 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:40.762318533 +0000 UTC m=+24.237339573 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.768690 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.777935 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.793269 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.803288 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.823412 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:38Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.862927 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7c1c8663-d263-4b8c-93fa-05ee1b61d7f1-hosts-file\") pod \"node-resolver-8htrc\" (UID: \"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\") " pod="openshift-dns/node-resolver-8htrc" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.862964 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj86t\" (UniqueName: \"kubernetes.io/projected/7c1c8663-d263-4b8c-93fa-05ee1b61d7f1-kube-api-access-dj86t\") pod \"node-resolver-8htrc\" (UID: \"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\") " pod="openshift-dns/node-resolver-8htrc" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.862981 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.863006 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:38 crc kubenswrapper[4691]: 
E0930 06:19:38.863100 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.863124 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.863134 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.863175 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:40.863160981 +0000 UTC m=+24.338182021 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.863445 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7c1c8663-d263-4b8c-93fa-05ee1b61d7f1-hosts-file\") pod \"node-resolver-8htrc\" (UID: \"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\") " pod="openshift-dns/node-resolver-8htrc" Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.863618 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.863639 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.863648 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:38 crc kubenswrapper[4691]: E0930 06:19:38.863677 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:40.863670277 +0000 UTC m=+24.338691317 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.880083 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj86t\" (UniqueName: \"kubernetes.io/projected/7c1c8663-d263-4b8c-93fa-05ee1b61d7f1-kube-api-access-dj86t\") pod \"node-resolver-8htrc\" (UID: \"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\") " pod="openshift-dns/node-resolver-8htrc" Sep 30 06:19:38 crc kubenswrapper[4691]: I0930 06:19:38.947380 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-8htrc" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.029830 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4w4k6"] Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.030439 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.043059 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-xjjw8"] Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.043539 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.045487 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.045624 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.045711 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sjmvw"] Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.045973 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.046270 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.046449 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.046559 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.046650 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.046740 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.046662 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.047288 4691 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-nzp64"] Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.056617 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.057155 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.059142 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.059270 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.063144 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.064938 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-var-lib-kubelet\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.064974 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-multus-conf-dir\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.064996 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-run-multus-certs\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065016 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-etc-kubernetes\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065036 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-cnibin\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065053 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-run-k8s-cni-cncf-io\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065089 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-var-lib-cni-bin\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065109 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-run-netns\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065127 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-var-lib-cni-multus\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065145 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5bfd073c-4582-4a65-8170-7030f4852174-multus-daemon-config\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065164 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjq2l\" (UniqueName: \"kubernetes.io/projected/5bfd073c-4582-4a65-8170-7030f4852174-kube-api-access-tjq2l\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065185 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xszvq\" (UniqueName: \"kubernetes.io/projected/69b46ade-8260-448f-84b7-506632d23ff9-kube-api-access-xszvq\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065219 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/69b46ade-8260-448f-84b7-506632d23ff9-rootfs\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065237 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69b46ade-8260-448f-84b7-506632d23ff9-proxy-tls\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065259 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-system-cni-dir\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065277 4691 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-multus-cni-dir\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065310 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-os-release\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065331 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5bfd073c-4582-4a65-8170-7030f4852174-cni-binary-copy\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065349 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-multus-socket-dir-parent\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065370 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/69b46ade-8260-448f-84b7-506632d23ff9-mcd-auth-proxy-config\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.065393 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-hostroot\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.066755 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.067049 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.067068 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.067220 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.067285 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.067366 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.071236 4691 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-config" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.078544 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.093172 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.108372 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.124141 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.141173 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.155826 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165636 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-cnibin\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165681 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-os-release\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165700 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-var-lib-openvswitch\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165725 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-var-lib-cni-bin\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165739 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-kubelet\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165753 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-log-socket\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165746 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-cnibin\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165770 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-run-netns\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165827 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-run-netns\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165868 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-openvswitch\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165921 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-bin\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165963 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-script-lib\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.165988 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69b46ade-8260-448f-84b7-506632d23ff9-proxy-tls\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166006 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovn-node-metrics-cert\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166035 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/69b46ade-8260-448f-84b7-506632d23ff9-rootfs\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166071 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/69b46ade-8260-448f-84b7-506632d23ff9-rootfs\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166082 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166136 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-netns\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166161 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-var-lib-cni-bin\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166178 4691 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-multus-cni-dir\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166223 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-systemd-units\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166264 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/69b46ade-8260-448f-84b7-506632d23ff9-mcd-auth-proxy-config\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166291 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-os-release\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166315 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5bfd073c-4582-4a65-8170-7030f4852174-cni-binary-copy\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166338 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-run-multus-certs\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166361 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-cnibin\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166380 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-multus-conf-dir\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166397 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-run-k8s-cni-cncf-io\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166412 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-etc-kubernetes\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166435 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-system-cni-dir\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166481 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166499 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-ovn\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166518 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-var-lib-cni-multus\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166535 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5bfd073c-4582-4a65-8170-7030f4852174-multus-daemon-config\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166553 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjq2l\" (UniqueName: \"kubernetes.io/projected/5bfd073c-4582-4a65-8170-7030f4852174-kube-api-access-tjq2l\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166577 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-systemd\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166597 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nvgw\" (UniqueName: \"kubernetes.io/projected/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-kube-api-access-5nvgw\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166639 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-xszvq\" (UniqueName: \"kubernetes.io/projected/69b46ade-8260-448f-84b7-506632d23ff9-kube-api-access-xszvq\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166668 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-cni-binary-copy\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166718 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmgj7\" (UniqueName: \"kubernetes.io/projected/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-kube-api-access-tmgj7\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166739 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-node-log\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166753 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-netd\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166766 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-config\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166781 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-ovn-kubernetes\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166804 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-system-cni-dir\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166826 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-multus-socket-dir-parent\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc 
kubenswrapper[4691]: I0930 06:19:39.166852 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-hostroot\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166874 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166917 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-etc-openvswitch\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166935 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-var-lib-kubelet\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166951 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-slash\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166977 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-env-overrides\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167082 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-os-release\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167386 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-system-cni-dir\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167443 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-multus-socket-dir-parent\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167480 4691 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-hostroot\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166918 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/69b46ade-8260-448f-84b7-506632d23ff9-mcd-auth-proxy-config\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167519 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5bfd073c-4582-4a65-8170-7030f4852174-cni-binary-copy\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167542 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-run-multus-certs\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.166341 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-multus-cni-dir\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167580 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-var-lib-kubelet\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167609 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-etc-kubernetes\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167616 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-run-k8s-cni-cncf-io\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167647 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-host-var-lib-cni-multus\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.167587 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5bfd073c-4582-4a65-8170-7030f4852174-multus-conf-dir\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" 
Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.168261 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5bfd073c-4582-4a65-8170-7030f4852174-multus-daemon-config\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.172320 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/69b46ade-8260-448f-84b7-506632d23ff9-proxy-tls\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.176091 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.184621 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xszvq\" (UniqueName: \"kubernetes.io/projected/69b46ade-8260-448f-84b7-506632d23ff9-kube-api-access-xszvq\") pod \"machine-config-daemon-4w4k6\" (UID: \"69b46ade-8260-448f-84b7-506632d23ff9\") " pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.191129 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.191525 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjq2l\" (UniqueName: \"kubernetes.io/projected/5bfd073c-4582-4a65-8170-7030f4852174-kube-api-access-tjq2l\") pod \"multus-xjjw8\" (UID: \"5bfd073c-4582-4a65-8170-7030f4852174\") " pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.206639 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.224215 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:39 crc kubenswrapper[4691]: E0930 06:19:39.224314 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.224359 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:39 crc kubenswrapper[4691]: E0930 06:19:39.224472 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.228296 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.228799 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.229519 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.230096 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.230655 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.231977 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.232478 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.233493 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.242050 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be
43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.263071 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267628 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267662 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-etc-openvswitch\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267678 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-slash\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267696 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-env-overrides\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267715 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-os-release\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267734 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-var-lib-openvswitch\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267749 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-kubelet\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267764 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-log-socket\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267787 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-openvswitch\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267803 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-bin\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267817 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-script-lib\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267832 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovn-node-metrics-cert\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267847 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267862 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-netns\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267896 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-systemd-units\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267913 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-cnibin\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267929 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-system-cni-dir\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267944 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267958 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-ovn\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267973 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-systemd\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.267986 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nvgw\" (UniqueName: \"kubernetes.io/projected/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-kube-api-access-5nvgw\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.268002 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-cni-binary-copy\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.268016 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmgj7\" (UniqueName: \"kubernetes.io/projected/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-kube-api-access-tmgj7\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.268029 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-node-log\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.268054 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-ovn-kubernetes\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.268066 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-netd\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 
06:19:39.268079 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-config\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.268712 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-config\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.269221 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.269268 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-etc-openvswitch\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.269292 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-slash\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.269579 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-env-overrides\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.269632 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-os-release\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.269654 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-var-lib-openvswitch\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.269673 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-kubelet\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.269697 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-log-socket\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.269721 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-openvswitch\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.269741 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-bin\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270208 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-script-lib\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270606 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-ovn\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270729 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-ovn-kubernetes\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270779 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-node-log\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270765 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-netd\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270804 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-systemd\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270807 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-sjmvw\" (UID: 
\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270827 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-netns\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270854 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-systemd-units\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270919 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-cnibin\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.270960 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-system-cni-dir\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.271111 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.271386 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-cni-binary-copy\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.273487 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovn-node-metrics-cert\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.281386 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.288465 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nvgw\" (UniqueName: \"kubernetes.io/projected/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-kube-api-access-5nvgw\") pod \"ovnkube-node-sjmvw\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.288947 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmgj7\" (UniqueName: \"kubernetes.io/projected/f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6-kube-api-access-tmgj7\") pod \"multus-additional-cni-plugins-nzp64\" (UID: \"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\") " pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.295522 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.313373 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.328256 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.338849 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.345263 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.353210 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8htrc" event={"ID":"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1","Type":"ContainerStarted","Data":"73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca"} Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.353247 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-8htrc" event={"ID":"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1","Type":"ContainerStarted","Data":"135023c464e5dd1364beff99cb7c5b741a1843084d83ccb026de265aff73a910"} Sep 30 06:19:39 crc kubenswrapper[4691]: W0930 06:19:39.355724 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69b46ade_8260_448f_84b7_506632d23ff9.slice/crio-757cb82315be148af8ddf6b447c8d45a3310a812b362975660e6b36f2031ba0a WatchSource:0}: Error finding container 757cb82315be148af8ddf6b447c8d45a3310a812b362975660e6b36f2031ba0a: Status 404 returned error can't find the container with id 757cb82315be148af8ddf6b447c8d45a3310a812b362975660e6b36f2031ba0a Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.363683 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.364910 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-xjjw8" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.377202 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: W0930 06:19:39.378649 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bfd073c_4582_4a65_8170_7030f4852174.slice/crio-3bd0eebb70c0d75956f4e4880424745902cabd0c564631d7e736b999f1cdab3a WatchSource:0}: Error finding container 3bd0eebb70c0d75956f4e4880424745902cabd0c564631d7e736b999f1cdab3a: Status 404 returned error can't find the container with id 3bd0eebb70c0d75956f4e4880424745902cabd0c564631d7e736b999f1cdab3a Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.383199 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nzp64" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.389198 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.399784 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.401731 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.414755 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: W0930 06:19:39.417154 4691 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f1b023d_cbb5_4ddf_a9d0_274d2fc70c1d.slice/crio-0ab6d01b4928b3b2967394cb35e4c99eb8f282ea2c9f82bfaba96756581f6b1b WatchSource:0}: Error finding container 0ab6d01b4928b3b2967394cb35e4c99eb8f282ea2c9f82bfaba96756581f6b1b: Status 404 returned error can't find the container with id 0ab6d01b4928b3b2967394cb35e4c99eb8f282ea2c9f82bfaba96756581f6b1b Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.447238 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"im
ageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.467870 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.479077 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.496427 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.543739 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.585202 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.617102 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.656048 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.694510 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.738781 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.786244 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.817699 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.858504 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.906271 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.936358 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:39 crc kubenswrapper[4691]: I0930 06:19:39.987392 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"start
edAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f2
36e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:39Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.024879 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.061195 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.224298 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.224424 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.358294 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919"} Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.358345 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c"} Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.358359 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"757cb82315be148af8ddf6b447c8d45a3310a812b362975660e6b36f2031ba0a"} Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.360862 4691 generic.go:334] "Generic (PLEG): container finished" podID="f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6" containerID="2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1" exitCode=0 Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.360935 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" event={"ID":"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6","Type":"ContainerDied","Data":"2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1"} Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.360955 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" event={"ID":"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6","Type":"ContainerStarted","Data":"3d6f127fbe0bdd268d7f7b38b4b7b47c3b2264e0766c983d243ac83515d31f18"} Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.362866 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjjw8" event={"ID":"5bfd073c-4582-4a65-8170-7030f4852174","Type":"ContainerStarted","Data":"3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9"} Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.362960 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjjw8" event={"ID":"5bfd073c-4582-4a65-8170-7030f4852174","Type":"ContainerStarted","Data":"3bd0eebb70c0d75956f4e4880424745902cabd0c564631d7e736b999f1cdab3a"} Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.364312 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163"} Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.366207 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72" exitCode=0 Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.366247 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" 
event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"} Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.366269 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"0ab6d01b4928b3b2967394cb35e4c99eb8f282ea2c9f82bfaba96756581f6b1b"} Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.378173 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.404027 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.417113 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.429372 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.455688 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.467864 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.478511 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.489380 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.544577 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.559254 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.578745 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/opensh
ift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.589729 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.600466 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.620243 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.656966 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.697273 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.747259 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"
mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.780408 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.780515 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.780543 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.780649 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.780680 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:19:44.78064144 +0000 UTC m=+28.255662520 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.780683 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.780731 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:44.780715182 +0000 UTC m=+28.255736262 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.780815 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:44.780787315 +0000 UTC m=+28.255808355 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.783894 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\
\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-over
rides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nv
gw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.820255 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.863968 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.881320 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.881441 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.881584 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.881624 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.881625 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.881644 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.881656 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.881676 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.881736 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:44.881716015 +0000 UTC m=+28.356737085 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:40 crc kubenswrapper[4691]: E0930 06:19:40.881765 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:44.881750896 +0000 UTC m=+28.356771976 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.912384 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.957981 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name
\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:40 crc kubenswrapper[4691]: I0930 06:19:40.981676 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:40Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.015380 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.058436 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.100230 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.135062 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.176310 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.225682 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:41 crc kubenswrapper[4691]: E0930 06:19:41.225833 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.226492 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:41 crc kubenswrapper[4691]: E0930 06:19:41.226602 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.372479 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" event={"ID":"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6","Type":"ContainerStarted","Data":"8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd"} Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.378420 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"} Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.378714 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"} Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.378977 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"} Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.379169 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"} Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.390756 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.411722 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.429768 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.443940 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.466876 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441e
cd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.484405 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.498097 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.522487 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.559357 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.579825 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.619626 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.657784 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.697560 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPat
h\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:41 crc kubenswrapper[4691]: I0930 06:19:41.749055 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:41Z 
is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.224449 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:42 crc kubenswrapper[4691]: E0930 06:19:42.224630 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.386552 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"} Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.386618 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"} Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.390795 4691 generic.go:334] "Generic (PLEG): container finished" podID="f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6" containerID="8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd" exitCode=0 Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.390865 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" event={"ID":"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6","Type":"ContainerDied","Data":"8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd"} Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.433764 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.458779 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-p7wmt"] Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.459334 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.461052 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.462347 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.462578 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.462451 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.464317 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.487241 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.499757 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jfq9\" (UniqueName: \"kubernetes.io/projected/99a6e728-8795-424d-a99e-7141c75baad5-kube-api-access-6jfq9\") pod \"node-ca-p7wmt\" (UID: \"99a6e728-8795-424d-a99e-7141c75baad5\") " pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.499796 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a6e728-8795-424d-a99e-7141c75baad5-host\") pod \"node-ca-p7wmt\" (UID: \"99a6e728-8795-424d-a99e-7141c75baad5\") " pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.499849 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/99a6e728-8795-424d-a99e-7141c75baad5-serviceca\") pod \"node-ca-p7wmt\" (UID: \"99a6e728-8795-424d-a99e-7141c75baad5\") " pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.517701 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.533170 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.546687 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.561599 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.574939 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.584552 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.597451 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.600871 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6jfq9\" (UniqueName: \"kubernetes.io/projected/99a6e728-8795-424d-a99e-7141c75baad5-kube-api-access-6jfq9\") pod \"node-ca-p7wmt\" (UID: \"99a6e728-8795-424d-a99e-7141c75baad5\") " pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.600939 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a6e728-8795-424d-a99e-7141c75baad5-host\") pod \"node-ca-p7wmt\" (UID: \"99a6e728-8795-424d-a99e-7141c75baad5\") " pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.600975 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/99a6e728-8795-424d-a99e-7141c75baad5-serviceca\") pod \"node-ca-p7wmt\" (UID: \"99a6e728-8795-424d-a99e-7141c75baad5\") " pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.601137 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/99a6e728-8795-424d-a99e-7141c75baad5-host\") pod \"node-ca-p7wmt\" (UID: \"99a6e728-8795-424d-a99e-7141c75baad5\") " pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.602554 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/99a6e728-8795-424d-a99e-7141c75baad5-serviceca\") pod \"node-ca-p7wmt\" (UID: \"99a6e728-8795-424d-a99e-7141c75baad5\") " pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.622832 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jfq9\" (UniqueName: \"kubernetes.io/projected/99a6e728-8795-424d-a99e-7141c75baad5-kube-api-access-6jfq9\") pod \"node-ca-p7wmt\" (UID: \"99a6e728-8795-424d-a99e-7141c75baad5\") " pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.628899 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be
43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.646632 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.661605 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.674774 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.684476 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernete
s.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.704559 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kube
rnetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.716367 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.728351 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.738735 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.748811 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.757716 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.768348 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025
-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.779912 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-p7wmt" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.780331 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: W0930 06:19:42.794851 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99a6e728_8795_424d_a99e_7141c75baad5.slice/crio-e6dead16dfb320027de3eb980cc4d5011b00ff484012761b0b6df1235f43a8a1 WatchSource:0}: Error finding container e6dead16dfb320027de3eb980cc4d5011b00ff484012761b0b6df1235f43a8a1: Status 404 returned error can't find the container with id e6dead16dfb320027de3eb980cc4d5011b00ff484012761b0b6df1235f43a8a1 Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.805123 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"moun
tPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.844491 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z 
is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.881210 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.921581 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.975032 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:42 crc kubenswrapper[4691]: I0930 06:19:42.998270 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:42Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.224791 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.224834 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:43 crc kubenswrapper[4691]: E0930 06:19:43.225000 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:43 crc kubenswrapper[4691]: E0930 06:19:43.225311 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.397243 4691 generic.go:334] "Generic (PLEG): container finished" podID="f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6" containerID="b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9" exitCode=0 Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.397366 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" event={"ID":"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6","Type":"ContainerDied","Data":"b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.399864 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p7wmt" event={"ID":"99a6e728-8795-424d-a99e-7141c75baad5","Type":"ContainerStarted","Data":"746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.399947 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p7wmt" event={"ID":"99a6e728-8795-424d-a99e-7141c75baad5","Type":"ContainerStarted","Data":"e6dead16dfb320027de3eb980cc4d5011b00ff484012761b0b6df1235f43a8a1"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.418334 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.435216 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.451339 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.464453 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.488268 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be
43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.509740 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.510748 4691 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.512466 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.512493 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.512501 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.512552 4691 kubelet_node_status.go:76] "Attempting to register node" node="crc" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.520688 4691 kubelet_node_status.go:115] "Node was previously registered" node="crc" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.521267 4691 kubelet_node_status.go:79] "Successfully registered node" node="crc" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.523158 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.523184 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.523191 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.523203 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.523212 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:43Z","lastTransitionTime":"2025-09-30T06:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.525443 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: E0930 06:19:43.535707 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d
7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.539452 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.541152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.541171 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.541202 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.541215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.541222 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:43Z","lastTransitionTime":"2025-09-30T06:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.556206 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: E0930 06:19:43.559949 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.568365 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.568447 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.568455 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.568467 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.568476 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:43Z","lastTransitionTime":"2025-09-30T06:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.577752 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc 
kubenswrapper[4691]: E0930 06:19:43.581234 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider 
started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshif
t-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d
34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.584834 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.584858 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:43 
crc kubenswrapper[4691]: I0930 06:19:43.584866 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.584878 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.584906 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:43Z","lastTransitionTime":"2025-09-30T06:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.590749 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:3
9Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: E0930 06:19:43.604995 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.607574 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.610035 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.610070 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.610078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.610091 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.610102 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:43Z","lastTransitionTime":"2025-09-30T06:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.625358 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: E0930 06:19:43.626866 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d
7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: E0930 06:19:43.627033 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.629344 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.629374 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.629382 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.629396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.629405 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:43Z","lastTransitionTime":"2025-09-30T06:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.653437 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.680477 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z 
is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.695243 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.712797 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.731968 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.732004 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.732012 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.732023 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.732032 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:43Z","lastTransitionTime":"2025-09-30T06:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.755726 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.813346 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.835429 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.835501 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.835519 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.835547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.835568 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:43Z","lastTransitionTime":"2025-09-30T06:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.839638 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.876007 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.919384 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.938001 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.938087 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.938114 4691 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.938143 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.938161 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:43Z","lastTransitionTime":"2025-09-30T06:19:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:43 crc kubenswrapper[4691]: I0930 06:19:43.954563 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.001069 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:43Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.037510 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.039999 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.040054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.040070 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.040093 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.040109 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:44Z","lastTransitionTime":"2025-09-30T06:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.085216 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd2
1335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.120525 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.143007 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.143061 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.143079 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.143101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.143120 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:44Z","lastTransitionTime":"2025-09-30T06:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.161630 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.205767 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.224757 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.225085 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.237995 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.246652 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.246717 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.246736 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.246761 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.246779 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:44Z","lastTransitionTime":"2025-09-30T06:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.350611 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.350673 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.350697 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.350727 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.350750 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:44Z","lastTransitionTime":"2025-09-30T06:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.409061 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.412362 4691 generic.go:334] "Generic (PLEG): container finished" podID="f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6" containerID="74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86" exitCode=0 Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.412414 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" event={"ID":"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6","Type":"ContainerDied","Data":"74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.430530 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.453331 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.453379 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.453391 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.453407 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.453420 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:44Z","lastTransitionTime":"2025-09-30T06:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.462520 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.478782 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.491978 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.503031 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.514094 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.581387 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.583238 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.583291 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.583308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.583336 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.583356 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:44Z","lastTransitionTime":"2025-09-30T06:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.602786 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.617569 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.638569 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.678637 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.684940 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.684974 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.684983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.684996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.685004 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:44Z","lastTransitionTime":"2025-09-30T06:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.725201 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.757627 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.790391 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.790446 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.790464 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.790488 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.790503 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:44Z","lastTransitionTime":"2025-09-30T06:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.798939 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.826377 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.826595 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:19:52.826563725 +0000 UTC m=+36.301584765 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.828850 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.828971 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.829065 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.829107 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:52.829096514 +0000 UTC m=+36.304117644 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.829432 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.829473 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:52.829463336 +0000 UTC m=+36.304484376 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.849592 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:44Z 
is after 2025-08-24T17:21:41Z" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.893342 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.893396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.893418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.893449 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.893469 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:44Z","lastTransitionTime":"2025-09-30T06:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.930391 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.930455 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.930601 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.930621 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.930633 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.930680 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:52.930663995 +0000 UTC m=+36.405685045 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.931264 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.931336 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.931365 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:44 crc kubenswrapper[4691]: E0930 06:19:44.931450 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:52.931419779 +0000 UTC m=+36.406440859 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.997343 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.997390 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.997401 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.997419 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:44 crc kubenswrapper[4691]: I0930 06:19:44.997432 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:44Z","lastTransitionTime":"2025-09-30T06:19:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.099913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.099977 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.099995 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.100021 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.100040 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:45Z","lastTransitionTime":"2025-09-30T06:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.202722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.202772 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.202787 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.202810 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.202825 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:45Z","lastTransitionTime":"2025-09-30T06:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.224411 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:45 crc kubenswrapper[4691]: E0930 06:19:45.224636 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.224434 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:45 crc kubenswrapper[4691]: E0930 06:19:45.224786 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.305978 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.306033 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.306052 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.306076 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.306094 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:45Z","lastTransitionTime":"2025-09-30T06:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.408652 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.408699 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.408716 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.408739 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.408755 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:45Z","lastTransitionTime":"2025-09-30T06:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.422023 4691 generic.go:334] "Generic (PLEG): container finished" podID="f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6" containerID="0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c" exitCode=0 Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.422078 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" event={"ID":"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6","Type":"ContainerDied","Data":"0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c"} Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.444324 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a57
8bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.466001 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.493612 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",
\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.511803 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.511832 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.511843 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.511859 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.511870 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:45Z","lastTransitionTime":"2025-09-30T06:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.517710 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd2
1335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.534736 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.552277 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.572510 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.585908 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.600365 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.614108 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.615448 4691 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.615494 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.615507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.615523 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.615841 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:45Z","lastTransitionTime":"2025-09-30T06:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.638139 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.653079 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.665836 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.679305 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.691949 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.719144 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.719220 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.719233 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.719252 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.719265 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:45Z","lastTransitionTime":"2025-09-30T06:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.822122 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.822167 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.822178 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.822198 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.822211 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:45Z","lastTransitionTime":"2025-09-30T06:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.925923 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.926001 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.926019 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.926044 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:45 crc kubenswrapper[4691]: I0930 06:19:45.926062 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:45Z","lastTransitionTime":"2025-09-30T06:19:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.028975 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.029036 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.029387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.029738 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.030038 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:46Z","lastTransitionTime":"2025-09-30T06:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.133131 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.133190 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.133208 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.133234 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.133251 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:46Z","lastTransitionTime":"2025-09-30T06:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.224254 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:46 crc kubenswrapper[4691]: E0930 06:19:46.224492 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.237348 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.237402 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.237420 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.237445 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.237465 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:46Z","lastTransitionTime":"2025-09-30T06:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.341221 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.341354 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.341558 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.341663 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.341746 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:46Z","lastTransitionTime":"2025-09-30T06:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.430148 4691 generic.go:334] "Generic (PLEG): container finished" podID="f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6" containerID="f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210" exitCode=0 Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.430232 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" event={"ID":"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6","Type":"ContainerDied","Data":"f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.439204 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.439721 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.439770 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.444581 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.444748 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.444868 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.444996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.445079 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:46Z","lastTransitionTime":"2025-09-30T06:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.453708 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.481167 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1
c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.515186 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.521736 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.522343 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.532115 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 
06:19:46.548069 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.548119 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.548138 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.548160 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.548193 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:46Z","lastTransitionTime":"2025-09-30T06:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.548769 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.563450 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.579370 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.592240 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 
2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.607087 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.618643 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.629914 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.651820 4691 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.651864 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.651875 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.651917 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.651929 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:46Z","lastTransitionTime":"2025-09-30T06:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.655923 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.670459 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPa
th\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.681867 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.696690 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.713675 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1
c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.737902 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.752175 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.756463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.758305 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.758421 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.758492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.758548 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:46Z","lastTransitionTime":"2025-09-30T06:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.773049 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.789111 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.805367 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.819550 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.837803 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.853097 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.861190 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.861243 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.861262 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.861288 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.861306 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:46Z","lastTransitionTime":"2025-09-30T06:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.865587 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.878538 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.887952 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.897386 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.917741 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.928600 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:46Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.963670 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.963722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.963739 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.963761 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:46 crc kubenswrapper[4691]: I0930 06:19:46.963777 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:46Z","lastTransitionTime":"2025-09-30T06:19:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.066547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.066868 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.067056 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.067198 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.067349 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:47Z","lastTransitionTime":"2025-09-30T06:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.169913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.169983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.170005 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.170034 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.170056 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:47Z","lastTransitionTime":"2025-09-30T06:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.224318 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.224362 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:47 crc kubenswrapper[4691]: E0930 06:19:47.224512 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:47 crc kubenswrapper[4691]: E0930 06:19:47.224632 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.247742 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.267615 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.277101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.277169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.277195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.277226 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.277245 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:47Z","lastTransitionTime":"2025-09-30T06:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.284457 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.303990 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.328860 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 
06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod 
\"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.349732 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.370711 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.379975 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.380042 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.380060 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.380086 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.380103 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:47Z","lastTransitionTime":"2025-09-30T06:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.389645 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.408289 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.426614 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.448755 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" event={"ID":"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6","Type":"ContainerStarted","Data":"5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.448857 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.462620 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/
static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.471465 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.480801 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.482561 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.482589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.482600 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.482615 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.482626 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:47Z","lastTransitionTime":"2025-09-30T06:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.501126 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0
e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\
\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.529532 4691 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:17
4f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ov
n-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.546533 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.565563 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.582351 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.585575 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.585635 4691 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.585653 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.585679 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.585700 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:47Z","lastTransitionTime":"2025-09-30T06:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.597346 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-va
r-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.612940 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"
192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.625750 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.641784 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.672712 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.688545 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.688571 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.688582 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.688598 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.688609 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:47Z","lastTransitionTime":"2025-09-30T06:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.693583 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.709024 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.722059 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.739991 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.756676 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.773250 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.791609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.791687 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.791712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.791741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.791763 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:47Z","lastTransitionTime":"2025-09-30T06:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.805494 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cr
i-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.849544 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c
12a48dc6b4d598426e4655fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.895001 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.895047 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.895065 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.895088 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.895105 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:47Z","lastTransitionTime":"2025-09-30T06:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.998641 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.998712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.998733 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.998756 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:47 crc kubenswrapper[4691]: I0930 06:19:47.998776 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:47Z","lastTransitionTime":"2025-09-30T06:19:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.102003 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.102118 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.102170 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.102202 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.102225 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:48Z","lastTransitionTime":"2025-09-30T06:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.205187 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.205272 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.205301 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.205332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.205355 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:48Z","lastTransitionTime":"2025-09-30T06:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.224685 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:48 crc kubenswrapper[4691]: E0930 06:19:48.224964 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.308017 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.308056 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.308068 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.308084 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.308095 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:48Z","lastTransitionTime":"2025-09-30T06:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.411583 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.411647 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.411676 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.411705 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.411724 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:48Z","lastTransitionTime":"2025-09-30T06:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.452362 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.514229 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.514290 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.514346 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.514372 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.514429 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:48Z","lastTransitionTime":"2025-09-30T06:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.616996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.617051 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.617068 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.617117 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.617136 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:48Z","lastTransitionTime":"2025-09-30T06:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.725319 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.725382 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.725402 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.725428 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.725445 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:48Z","lastTransitionTime":"2025-09-30T06:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.828668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.828719 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.828735 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.828758 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.828775 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:48Z","lastTransitionTime":"2025-09-30T06:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.932053 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.932103 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.932120 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.932143 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:48 crc kubenswrapper[4691]: I0930 06:19:48.932160 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:48Z","lastTransitionTime":"2025-09-30T06:19:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.036741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.036799 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.036816 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.036841 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.036858 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:49Z","lastTransitionTime":"2025-09-30T06:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.140419 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.140495 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.140519 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.140549 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.140575 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:49Z","lastTransitionTime":"2025-09-30T06:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.224457 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.224630 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 06:19:49 crc kubenswrapper[4691]: E0930 06:19:49.224820 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
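Because the network plugin is not ready, the records above show sandbox creation and pod sync being skipped for the two network-diagnostics pods. A throwaway triage helper (hypothetical, not part of any Kubernetes tooling) can pull the affected pods out of a journal dump like this one when fed the text on stdin:

```go
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

// Reads journal text on stdin and lists each pod named in an
// "Error syncing pod, skipping" record, deduplicated by pod UID.
func main() {
	re := regexp.MustCompile(`"Error syncing pod, skipping".*?pod="([^"]+)" podUID="([^"]+)"`)
	sc := bufio.NewScanner(os.Stdin)
	// The status-patch records later in this log far exceed bufio's
	// default 64 KiB line limit, so widen the buffer.
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024)
	seen := map[string]string{}
	for sc.Scan() {
		for _, m := range re.FindAllStringSubmatch(sc.Text(), -1) {
			seen[m[2]] = m[1]
		}
	}
	for uid, pod := range seen {
		fmt.Printf("%s uid=%s\n", pod, uid)
	}
}
```

Fed this section, it would report network-check-source-55646444c4-trplf and network-check-target-xd92c with the podUID values recorded above and below.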
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:49 crc kubenswrapper[4691]: E0930 06:19:49.225033 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.243842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.243973 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.243998 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.244025 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.244043 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:49Z","lastTransitionTime":"2025-09-30T06:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.347962 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.348032 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.348049 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.348072 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.348088 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:49Z","lastTransitionTime":"2025-09-30T06:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.451163 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.451239 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.451258 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.451592 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.451872 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:49Z","lastTransitionTime":"2025-09-30T06:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.459424 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/0.log"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.464112 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa" exitCode=1
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.464169 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa"}
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.465262 4691 scope.go:117] "RemoveContainer" containerID="1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa"
Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.494204 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.531309 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:49Z\\\",\\\"message\\\":\\\"val\\\\nI0930 06:19:48.989528 6009 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 06:19:48.989574 6009 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 06:19:48.989589 6009 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 06:19:48.989608 6009 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 06:19:48.989611 6009 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 06:19:48.989615 6009 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 06:19:48.989633 6009 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 06:19:48.989645 6009 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 06:19:48.989661 6009 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 06:19:48.989659 6009 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:19:48.989674 6009 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 06:19:48.989685 6009 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:19:48.989686 6009 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 06:19:48.989707 6009 factory.go:656] Stopping watch factory\\\\nI0930 06:19:48.989725 6009 ovnkube.go:599] Stopped ovnkube\\\\nI0930 06:19:48.989726 6009 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 
06:19:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0
d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.548906 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.554421 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.554461 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.554472 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.554488 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.554502 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:49Z","lastTransitionTime":"2025-09-30T06:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.567491 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.582396 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\
":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.598766 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.618566 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.637879 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.657084 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.657105 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.657115 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.657129 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.657141 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:49Z","lastTransitionTime":"2025-09-30T06:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.659938 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.679912 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.702434 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.718424 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.732595 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.756622 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.764448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.764493 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.764507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.764527 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.764544 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:49Z","lastTransitionTime":"2025-09-30T06:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.776806 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:49Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.867641 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.867688 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.867704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.867732 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.867749 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:49Z","lastTransitionTime":"2025-09-30T06:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.971070 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.971135 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.971165 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.971190 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:49 crc kubenswrapper[4691]: I0930 06:19:49.971208 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:49Z","lastTransitionTime":"2025-09-30T06:19:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.074205 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.074261 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.074276 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.074295 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.074310 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:50Z","lastTransitionTime":"2025-09-30T06:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.177080 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.177127 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.177141 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.177158 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.177171 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:50Z","lastTransitionTime":"2025-09-30T06:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.224621 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:50 crc kubenswrapper[4691]: E0930 06:19:50.224766 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.280227 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.280270 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.280286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.280308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.280325 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:50Z","lastTransitionTime":"2025-09-30T06:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.383232 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.383301 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.383313 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.383330 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.383342 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:50Z","lastTransitionTime":"2025-09-30T06:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.469436 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/0.log" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.473928 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6"} Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.474191 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.486498 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.486535 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.486547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.486565 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.486579 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:50Z","lastTransitionTime":"2025-09-30T06:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.498432 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.528025 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:49Z\\\",\\\"message\\\":\\\"val\\\\nI0930 06:19:48.989528 6009 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 06:19:48.989574 6009 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 06:19:48.989589 6009 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 06:19:48.989608 6009 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 06:19:48.989611 6009 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 06:19:48.989615 6009 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 06:19:48.989633 6009 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 06:19:48.989645 6009 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 06:19:48.989661 6009 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 06:19:48.989659 6009 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:19:48.989674 6009 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 06:19:48.989685 6009 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:19:48.989686 6009 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 06:19:48.989707 6009 factory.go:656] Stopping watch factory\\\\nI0930 06:19:48.989725 6009 ovnkube.go:599] Stopped ovnkube\\\\nI0930 06:19:48.989726 6009 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 
06:19:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\
\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.547778 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" 
for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.562936 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.581943 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.588787 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.588826 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.588837 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.588855 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.588868 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:50Z","lastTransitionTime":"2025-09-30T06:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.598499 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.615558 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.634696 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.648359 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.662433 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.677632 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.690721 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.691193 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.691287 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.691302 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.691317 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.691330 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:50Z","lastTransitionTime":"2025-09-30T06:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.703129 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.733026 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.750508 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:50Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.794198 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.794241 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.794256 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.794281 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.794300 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:50Z","lastTransitionTime":"2025-09-30T06:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.898105 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.898171 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.898189 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.898213 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:50 crc kubenswrapper[4691]: I0930 06:19:50.898230 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:50Z","lastTransitionTime":"2025-09-30T06:19:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.001132 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.001211 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.001234 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.001269 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.001292 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:51Z","lastTransitionTime":"2025-09-30T06:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.104080 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.104154 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.104172 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.104639 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.104693 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:51Z","lastTransitionTime":"2025-09-30T06:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.207627 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.207709 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.207734 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.207763 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.207786 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:51Z","lastTransitionTime":"2025-09-30T06:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.224164 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.224290 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:51 crc kubenswrapper[4691]: E0930 06:19:51.224389 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:51 crc kubenswrapper[4691]: E0930 06:19:51.224503 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.310471 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.310532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.310554 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.310582 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.310605 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:51Z","lastTransitionTime":"2025-09-30T06:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.415327 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.415433 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.415451 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.415598 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.415621 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:51Z","lastTransitionTime":"2025-09-30T06:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.482487 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/1.log" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.483532 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/0.log" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.488121 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6" exitCode=1 Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.488207 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.488306 4691 scope.go:117] "RemoveContainer" containerID="1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.489457 4691 scope.go:117] "RemoveContainer" containerID="62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6" Sep 30 06:19:51 crc kubenswrapper[4691]: E0930 06:19:51.489723 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.512804 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.518487 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.518541 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.518559 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.518581 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.518600 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:51Z","lastTransitionTime":"2025-09-30T06:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.533735 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.549181 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.564508 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.588387 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be
43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.604719 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af
0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.621071 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.622722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.622768 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.622785 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.622808 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.622825 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:51Z","lastTransitionTime":"2025-09-30T06:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.640650 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w"] Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.640869 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc 
kubenswrapper[4691]: I0930 06:19:51.641339 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.644616 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.645003 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.660725 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.675957 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.691717 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.706533 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.706584 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.706721 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h4k7\" (UniqueName: \"kubernetes.io/projected/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-kube-api-access-4h4k7\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.706962 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.709122 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.725147 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.726132 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.726182 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.726200 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.726221 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.726236 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:51Z","lastTransitionTime":"2025-09-30T06:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.745142 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cr
i-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.774280 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2
e2236cdf0265027e50fa60a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:49Z\\\",\\\"message\\\":\\\"val\\\\nI0930 06:19:48.989528 6009 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 06:19:48.989574 6009 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 06:19:48.989589 6009 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 06:19:48.989608 6009 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 06:19:48.989611 6009 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 06:19:48.989615 6009 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 06:19:48.989633 6009 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 06:19:48.989645 6009 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 06:19:48.989661 6009 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 06:19:48.989659 6009 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:19:48.989674 6009 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 06:19:48.989685 6009 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:19:48.989686 6009 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 06:19:48.989707 6009 factory.go:656] Stopping watch factory\\\\nI0930 06:19:48.989725 6009 ovnkube.go:599] Stopped ovnkube\\\\nI0930 06:19:48.989726 6009 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 06:19:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:50Z\\\",\\\"message\\\":\\\"365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 06:19:50.520239 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0930 06:19:50.520245 6155 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 06:19:50.520244 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 06:19:50.520170 6155 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:19:50.520261 6155 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 06:19:50.520262 6155 obj_retry.go:303] Retry object setup: *v1.Pod open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.793956 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.807983 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.808024 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.808084 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h4k7\" (UniqueName: \"kubernetes.io/projected/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-kube-api-access-4h4k7\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.808108 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.808865 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.809335 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-env-overrides\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 
06:19:51.813634 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.813924 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.829257 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.829290 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.829301 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.829316 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.829328 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:51Z","lastTransitionTime":"2025-09-30T06:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.833231 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube
rnetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.834574 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h4k7\" (UniqueName: \"kubernetes.io/projected/37b1a1c7-92d7-41ea-b5c1-4a56b40f819e-kube-api-access-4h4k7\") pod \"ovnkube-control-plane-749d76644c-fxv8w\" (UID: \"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.847282 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.1
1\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.863278 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\
"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.895043 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e
779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.914445 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.930339 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.932945 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.933007 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.933025 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.933055 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.933072 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:51Z","lastTransitionTime":"2025-09-30T06:19:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.950231 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.963335 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.965833 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.981422 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:51 crc kubenswrapper[4691]: W0930 06:19:51.986012 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37b1a1c7_92d7_41ea_b5c1_4a56b40f819e.slice/crio-affde1c54776a336ed50e9be225b10468f1248554135ce09d203a3e8a08d0009 WatchSource:0}: Error finding container affde1c54776a336ed50e9be225b10468f1248554135ce09d203a3e8a08d0009: Status 404 returned error can't find the container with id affde1c54776a336ed50e9be225b10468f1248554135ce09d203a3e8a08d0009 Sep 30 06:19:51 crc kubenswrapper[4691]: I0930 06:19:51.996495 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:51Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.015601 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a6
6771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.035514 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.035572 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.035519 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.035587 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.035874 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.035939 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:52Z","lastTransitionTime":"2025-09-30T06:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.063467 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.096181 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:49Z\\\",\\\"message\\\":\\\"val\\\\nI0930 06:19:48.989528 6009 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 06:19:48.989574 6009 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 06:19:48.989589 6009 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 06:19:48.989608 6009 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 06:19:48.989611 6009 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 06:19:48.989615 6009 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 06:19:48.989633 6009 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 06:19:48.989645 6009 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 06:19:48.989661 6009 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 06:19:48.989659 6009 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:19:48.989674 6009 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 06:19:48.989685 6009 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:19:48.989686 6009 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 06:19:48.989707 6009 factory.go:656] Stopping watch factory\\\\nI0930 06:19:48.989725 6009 ovnkube.go:599] Stopped ovnkube\\\\nI0930 06:19:48.989726 6009 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 06:19:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:50Z\\\",\\\"message\\\":\\\"365] Adding new object: *v1.Pod 
openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 06:19:50.520239 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0930 06:19:50.520245 6155 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 06:19:50.520244 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 06:19:50.520170 6155 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:19:50.520261 6155 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 06:19:50.520262 6155 obj_retry.go:303] Retry object setup: *v1.Pod open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf025
3d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.137953 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.138013 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.138031 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.138054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:52 crc kubenswrapper[4691]: 
I0930 06:19:52.138074 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:52Z","lastTransitionTime":"2025-09-30T06:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.223714 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:52 crc kubenswrapper[4691]: E0930 06:19:52.223840 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.240532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.240581 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.240594 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.240614 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.240627 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:52Z","lastTransitionTime":"2025-09-30T06:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.342842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.342921 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.342934 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.342952 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.342964 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:52Z","lastTransitionTime":"2025-09-30T06:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.446576 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.446660 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.446687 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.446719 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.446738 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:52Z","lastTransitionTime":"2025-09-30T06:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.495125 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" event={"ID":"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e","Type":"ContainerStarted","Data":"b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.495208 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" event={"ID":"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e","Type":"ContainerStarted","Data":"be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.495239 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" event={"ID":"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e","Type":"ContainerStarted","Data":"affde1c54776a336ed50e9be225b10468f1248554135ce09d203a3e8a08d0009"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.498124 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/1.log" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.529014 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.549878 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.549936 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.549945 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.549960 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.549972 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:52Z","lastTransitionTime":"2025-09-30T06:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.550174 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.561208 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.573311 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 
06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.589084 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.604770 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.621720 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.635792 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.647239 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.651671 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.651726 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.651745 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.651769 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.651795 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:52Z","lastTransitionTime":"2025-09-30T06:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.660694 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.673390 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.705183 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.720803 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.739010 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d048
9a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-b
inary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"termin
ated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.754857 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.754946 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.754964 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.754989 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.755006 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:52Z","lastTransitionTime":"2025-09-30T06:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.762142 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:49Z\\\",\\\"message\\\":\\\"val\\\\nI0930 06:19:48.989528 6009 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 06:19:48.989574 6009 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 06:19:48.989589 6009 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 06:19:48.989608 6009 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 06:19:48.989611 6009 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 06:19:48.989615 6009 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 06:19:48.989633 6009 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 06:19:48.989645 6009 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 06:19:48.989661 6009 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 06:19:48.989659 6009 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:19:48.989674 6009 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 06:19:48.989685 6009 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:19:48.989686 6009 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 06:19:48.989707 6009 factory.go:656] Stopping watch factory\\\\nI0930 06:19:48.989725 6009 ovnkube.go:599] Stopped ovnkube\\\\nI0930 06:19:48.989726 6009 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 
06:19:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:50Z\\\",\\\"message\\\":\\\"365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 06:19:50.520239 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0930 06:19:50.520245 6155 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 06:19:50.520244 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 06:19:50.520170 6155 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:19:50.520261 6155 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 06:19:50.520262 6155 obj_retry.go:303] Retry object setup: *v1.Pod 
open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.782522 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:52Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.860071 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.860128 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.860144 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.860167 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.860185 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:52Z","lastTransitionTime":"2025-09-30T06:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.919003 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.919165 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:52 crc kubenswrapper[4691]: E0930 06:19:52.919220 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:20:08.919180058 +0000 UTC m=+52.394201138 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:19:52 crc kubenswrapper[4691]: E0930 06:19:52.919304 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.919357 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:52 crc kubenswrapper[4691]: E0930 06:19:52.919379 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:08.919354714 +0000 UTC m=+52.394375824 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:19:52 crc kubenswrapper[4691]: E0930 06:19:52.919514 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:52 crc kubenswrapper[4691]: E0930 06:19:52.919599 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:08.919584611 +0000 UTC m=+52.394605691 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.962998 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.963065 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.963083 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.963110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:52 crc kubenswrapper[4691]: I0930 06:19:52.963128 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:52Z","lastTransitionTime":"2025-09-30T06:19:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.020306 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.020435 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.020553 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.020602 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.020623 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.020680 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.020717 4691 projected.go:288] Couldn't get 
configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.020747 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.020720 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:09.020688687 +0000 UTC m=+52.495709767 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.020869 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:09.020833851 +0000 UTC m=+52.495854941 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.065746 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.065808 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.065826 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.065850 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.065867 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.157352 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-svjxq"] Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.158075 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.158168 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.169226 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.169292 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.169314 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.169341 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.169363 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.182658 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.204813 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.222639 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q98dd\" (UniqueName: \"kubernetes.io/projected/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-kube-api-access-q98dd\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " 
pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.222761 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.224744 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.224753 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.225062 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.224902 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.229767 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2
e2236cdf0265027e50fa60a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:49Z\\\",\\\"message\\\":\\\"val\\\\nI0930 06:19:48.989528 6009 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 06:19:48.989574 6009 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 06:19:48.989589 6009 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 06:19:48.989608 6009 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 06:19:48.989611 6009 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 06:19:48.989615 6009 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 06:19:48.989633 6009 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 06:19:48.989645 6009 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 06:19:48.989661 6009 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 06:19:48.989659 6009 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:19:48.989674 6009 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 06:19:48.989685 6009 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:19:48.989686 6009 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 06:19:48.989707 6009 factory.go:656] Stopping watch factory\\\\nI0930 06:19:48.989725 6009 ovnkube.go:599] Stopped ovnkube\\\\nI0930 06:19:48.989726 6009 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 06:19:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:50Z\\\",\\\"message\\\":\\\"365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 06:19:50.520239 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0930 06:19:50.520245 6155 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 06:19:50.520244 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 06:19:50.520170 6155 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e 
Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:19:50.520261 6155 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 06:19:50.520262 6155 obj_retry.go:303] Retry object setup: *v1.Pod open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.256165 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.269052 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.272182 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.272232 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.272244 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.272263 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.272275 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.287108 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.303252 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.320036 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.323234 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q98dd\" (UniqueName: \"kubernetes.io/projected/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-kube-api-access-q98dd\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.323289 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.323389 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.323443 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs podName:a8ed6f92-0b98-4b1b-a46e-4d0604d686a1 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:53.823428376 +0000 UTC m=+37.298449416 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs") pod "network-metrics-daemon-svjxq" (UID: "a8ed6f92-0b98-4b1b-a46e-4d0604d686a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.334460 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 
2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.351293 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q98dd\" (UniqueName: \"kubernetes.io/projected/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-kube-api-access-q98dd\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.363655 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a
93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.374851 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.374938 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.374956 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.374983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.375000 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.385399 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.404162 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.422245 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.441948 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.453460 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.470750 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.478418 4691 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.478664 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.478681 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.478709 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.478726 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.492158 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\
\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.581189 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.581255 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.581271 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.581295 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.581312 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.684078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.684122 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.684131 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.684145 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.684155 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.695955 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.696006 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.696027 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.696051 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.696068 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.715607 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.720691 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.720750 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.720770 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.720795 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.720811 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.740568 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.744696 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.744742 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.744759 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.744781 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.744797 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.759339 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload elided; byte-identical to the attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.764011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.764077 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
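Every one of these retries dies in the same place: the apiserver cannot call the node.network-node-identity.openshift.io validating webhook because the serving certificate at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, more than a month before the node clock's 2025-09-30 reading. A minimal Python sketch of the kind of spot check that confirms this from the node (it assumes the third-party cryptography package is installed and that the endpoint from the log is reachable; none of this is OpenShift tooling):

```python
# Fetch the webhook's serving certificate without verifying it (an expired
# cert would fail normal verification, which is exactly the problem here)
# and print its validity window.
import ssl
from datetime import datetime, timezone

from cryptography import x509  # third-party: pip install cryptography

HOST, PORT = "127.0.0.1", 9743  # endpoint from the failed webhook Post above

pem = ssl.get_server_certificate((HOST, PORT))  # does not verify the chain
cert = x509.load_pem_x509_certificate(pem.encode("ascii"))

now = datetime.now(timezone.utc)
not_after = cert.not_valid_after.replace(tzinfo=timezone.utc)
print("subject:  ", cert.subject.rfc4514_string())
print("not after:", not_after.strftime("%Y-%m-%dT%H:%M:%SZ"))
print("EXPIRED" if now > not_after else "still valid")
```

Run against the endpoint above, this would be expected to report a notAfter of 2025-08-24T17:21:41Z and EXPIRED, matching the x509 error in the entries; on CRC this class of failure typically clears once the cluster's internal certificate rotation catches up after start.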
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.764102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.764130 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.764153 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.779827 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload elided; byte-identical to the attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.785728 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.785782 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
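The blob inside each err="failed to patch status ..." is an ordinary JSON strategic-merge patch; it only looks opaque because it was escaped twice on its way into the journal, once when the patch string was embedded in the Go error and once when klog quoted the error. A rough Python sketch for recovering it from a captured line; the quoting assumptions match this capture specifically, not journald output in general:

```python
# Extract and decode the escaped status-patch JSON from one of the
# "failed to patch status" journal lines, then inspect the node
# conditions the kubelet was trying to report.
import json
import re

def extract_patch(journal_line: str) -> dict:
    # The patch sits between 'failed to patch status \"' and '\" for node'.
    m = re.search(r'failed to patch status \\"(.*)\\" for node', journal_line)
    if m is None:
        raise ValueError("no status patch found in this line")
    body = m.group(1)
    for _ in range(2):  # undo the two layers of backslash escaping
        body = body.encode("utf-8").decode("unicode_escape")
    return json.loads(body)

# Usage, with `line` read from e.g. `journalctl -u kubelet` output:
#   for cond in extract_patch(line)["status"]["conditions"]:
#       print(cond["type"], cond["status"], cond["reason"])
# For the full entries above this lists MemoryPressure, DiskPressure,
# PIDPressure, and Ready, with Ready False/KubeletNotReady carrying the
# missing-CNI-config message.
```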
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.785801 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.785824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.785840 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.805708 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload elided; byte-identical to the attempt above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:53Z is after 2025-08-24T17:21:41Z"
Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.806037 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.809084 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
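The sequence just above is the kubelet's bounded retry around node status updates: each sync it attempts the patch a fixed number of times (the nodeStatusUpdateRetry constant, 5 in upstream kubelet sources), logging "will retry" on each failure, then gives up with "update node status exceeds retry count" until the next sync period. A schematic Python sketch of the pattern; the function names are illustrative, not the kubelet's API:

```python
# Bounded retry as the kubelet applies it to node status patches: a fixed
# attempt budget per sync, then give up until the next sync period.
NODE_STATUS_UPDATE_RETRY = 5  # mirrors kubelet's nodeStatusUpdateRetry

def update_node_status(patch_once) -> bool:
    for _ in range(NODE_STATUS_UPDATE_RETRY):
        try:
            patch_once()  # here: PATCH .../nodes/crc/status via the apiserver
            return True
        except Exception as err:  # e.g. the webhook's expired-cert rejection
            print(f'"Error updating node status, will retry" err="{err}"')
    print('"Unable to update node status" err="update node status exceeds retry count"')
    return False
```

With a permanently failing patch, every sync burns the full attempt budget, which is why the identical error repeats in bursts throughout this log.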
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.809135 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.809152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.809175 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.809197 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.831447 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq"
Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.831644 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 30 06:19:53 crc kubenswrapper[4691]: E0930 06:19:53.831733 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs podName:a8ed6f92-0b98-4b1b-a46e-4d0604d686a1 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:54.831707042 +0000 UTC m=+38.306728122 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs") pod "network-metrics-daemon-svjxq" (UID: "a8ed6f92-0b98-4b1b-a46e-4d0604d686a1") : object "openshift-multus"/"metrics-daemon-secret" not registered
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.912686 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.912743 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.912760 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.912787 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:53 crc kubenswrapper[4691]: I0930 06:19:53.912806 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:53Z","lastTransitionTime":"2025-09-30T06:19:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.015667 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.015747 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.015765 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.015791 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.015809 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:54Z","lastTransitionTime":"2025-09-30T06:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.119173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.119250 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.119268 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.119293 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.119309 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:54Z","lastTransitionTime":"2025-09-30T06:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.222459 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.222544 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.222568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.222599 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.222623 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:54Z","lastTransitionTime":"2025-09-30T06:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.223800 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.223811 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq"
Sep 30 06:19:54 crc kubenswrapper[4691]: E0930 06:19:54.223975 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 06:19:54 crc kubenswrapper[4691]: E0930 06:19:54.224135 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.325051 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.325122 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.325144 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.325171 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.325189 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:54Z","lastTransitionTime":"2025-09-30T06:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.429347 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.429410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.429433 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.429459 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.429479 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:54Z","lastTransitionTime":"2025-09-30T06:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.532313 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.532381 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.532397 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.532420 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.532439 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:54Z","lastTransitionTime":"2025-09-30T06:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.635415 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.635505 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.635532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.635562 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.635603 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:54Z","lastTransitionTime":"2025-09-30T06:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Has your network provider started?"} Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.739027 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.739092 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.739111 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.739139 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.739186 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:54Z","lastTransitionTime":"2025-09-30T06:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.842402 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.842474 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.842491 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.842516 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.842533 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:54Z","lastTransitionTime":"2025-09-30T06:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.843241 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:19:54 crc kubenswrapper[4691]: E0930 06:19:54.843507 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:19:54 crc kubenswrapper[4691]: E0930 06:19:54.843611 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs podName:a8ed6f92-0b98-4b1b-a46e-4d0604d686a1 nodeName:}" failed. No retries permitted until 2025-09-30 06:19:56.843582116 +0000 UTC m=+40.318603186 (durationBeforeRetry 2s). 
Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.945628 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.945687 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.945704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.945729 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:54 crc kubenswrapper[4691]: I0930 06:19:54.945747 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:54Z","lastTransitionTime":"2025-09-30T06:19:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.049326 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.049386 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.049404 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.049430 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.049447 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:55Z","lastTransitionTime":"2025-09-30T06:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.152372 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.152422 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.152439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.152461 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.152477 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:55Z","lastTransitionTime":"2025-09-30T06:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.225107 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:55 crc kubenswrapper[4691]: E0930 06:19:55.225275 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.225319 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:55 crc kubenswrapper[4691]: E0930 06:19:55.225492 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.255385 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.255493 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.255519 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.255594 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.255616 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:55Z","lastTransitionTime":"2025-09-30T06:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.358540 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.358640 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.358686 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.358709 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.358725 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:55Z","lastTransitionTime":"2025-09-30T06:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.461852 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.461939 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.461957 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.461982 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.462010 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:55Z","lastTransitionTime":"2025-09-30T06:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.565635 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.565691 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.565708 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.565729 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.565746 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:55Z","lastTransitionTime":"2025-09-30T06:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.668579 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.668649 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.668671 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.668701 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.668721 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:55Z","lastTransitionTime":"2025-09-30T06:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.772148 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.772216 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.772235 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.772257 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.772275 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:55Z","lastTransitionTime":"2025-09-30T06:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.875554 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.875630 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.875646 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.875669 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.875686 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:55Z","lastTransitionTime":"2025-09-30T06:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.979288 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.979336 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.979353 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.979377 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:55 crc kubenswrapper[4691]: I0930 06:19:55.979394 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:55Z","lastTransitionTime":"2025-09-30T06:19:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.087913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.087975 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.087991 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.088016 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.088035 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:56Z","lastTransitionTime":"2025-09-30T06:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.191384 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.191465 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.191489 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.191520 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.191540 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:56Z","lastTransitionTime":"2025-09-30T06:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.224271 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.224322 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:56 crc kubenswrapper[4691]: E0930 06:19:56.224463 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:19:56 crc kubenswrapper[4691]: E0930 06:19:56.224577 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.295057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.295120 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.295139 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.295165 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.295189 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:56Z","lastTransitionTime":"2025-09-30T06:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.398611 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.398670 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.398687 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.398710 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.398727 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:56Z","lastTransitionTime":"2025-09-30T06:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.502264 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.502325 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.502342 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.502398 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.502418 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:56Z","lastTransitionTime":"2025-09-30T06:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.605974 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.606032 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.606048 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.606072 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.606089 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:56Z","lastTransitionTime":"2025-09-30T06:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.708759 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.708818 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.708835 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.708861 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.708878 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:56Z","lastTransitionTime":"2025-09-30T06:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.812282 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.812365 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.812388 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.812417 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.812437 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:56Z","lastTransitionTime":"2025-09-30T06:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.864593 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:19:56 crc kubenswrapper[4691]: E0930 06:19:56.864776 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:19:56 crc kubenswrapper[4691]: E0930 06:19:56.864870 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs podName:a8ed6f92-0b98-4b1b-a46e-4d0604d686a1 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:00.864844994 +0000 UTC m=+44.339866064 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs") pod "network-metrics-daemon-svjxq" (UID: "a8ed6f92-0b98-4b1b-a46e-4d0604d686a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.916545 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.916625 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.916648 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.916680 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:56 crc kubenswrapper[4691]: I0930 06:19:56.916704 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:56Z","lastTransitionTime":"2025-09-30T06:19:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.019797 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.020105 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.020218 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.020351 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.020420 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:57Z","lastTransitionTime":"2025-09-30T06:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.122327 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.122401 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.122412 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.122428 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.122439 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:57Z","lastTransitionTime":"2025-09-30T06:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.223966 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:57 crc kubenswrapper[4691]: E0930 06:19:57.224103 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.224506 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:57 crc kubenswrapper[4691]: E0930 06:19:57.224603 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.225039 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.225075 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.225089 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.225107 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.225124 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:57Z","lastTransitionTime":"2025-09-30T06:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.244578 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.259831 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.276859 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.292424 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.308945 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.334738 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.334794 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.334813 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.334837 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.334854 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:57Z","lastTransitionTime":"2025-09-30T06:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.342611 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o:/
/a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f
82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.370772 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.391237 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.424089 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1d7e580b7d8029c47c3d2878f40287e2bbd9672c12a48dc6b4d598426e4655fa\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:49Z\\\",\\\"message\\\":\\\"val\\\\nI0930 06:19:48.989528 6009 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0930 06:19:48.989574 6009 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0930 06:19:48.989589 6009 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0930 06:19:48.989608 6009 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0930 06:19:48.989611 6009 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0930 06:19:48.989615 6009 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0930 06:19:48.989633 6009 handler.go:208] Removed *v1.Node event handler 2\\\\nI0930 06:19:48.989645 6009 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0930 06:19:48.989661 6009 handler.go:208] Removed *v1.Node event handler 7\\\\nI0930 06:19:48.989659 6009 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:19:48.989674 6009 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0930 06:19:48.989685 6009 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:19:48.989686 6009 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0930 06:19:48.989707 6009 factory.go:656] Stopping watch factory\\\\nI0930 06:19:48.989725 6009 ovnkube.go:599] Stopped ovnkube\\\\nI0930 06:19:48.989726 6009 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0930 06:19:4\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:50Z\\\",\\\"message\\\":\\\"365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 06:19:50.520239 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0930 06:19:50.520245 6155 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 06:19:50.520244 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 06:19:50.520170 6155 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} 
selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:19:50.520261 6155 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 06:19:50.520262 6155 obj_retry.go:303] Retry object setup: *v1.Pod open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.438529 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.438967 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.439082 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.439150 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.439172 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:57Z","lastTransitionTime":"2025-09-30T06:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.444194 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.462113 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.476495 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.489712 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.503262 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 
06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.521354 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.540533 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.543302 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.543368 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.543387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.543412 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.543429 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:57Z","lastTransitionTime":"2025-09-30T06:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.559382 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:19:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.646531 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.646581 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.646598 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.646623 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.646640 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:57Z","lastTransitionTime":"2025-09-30T06:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.750313 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.750375 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.750396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.750421 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.750440 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:57Z","lastTransitionTime":"2025-09-30T06:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.854481 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.854552 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.854573 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.854598 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.854617 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:57Z","lastTransitionTime":"2025-09-30T06:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.958101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.958165 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.958184 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.958209 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:57 crc kubenswrapper[4691]: I0930 06:19:57.958228 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:57Z","lastTransitionTime":"2025-09-30T06:19:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.061797 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.061858 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.061875 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.061921 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.061938 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:58Z","lastTransitionTime":"2025-09-30T06:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.165970 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.166023 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.166042 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.166066 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.166084 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:58Z","lastTransitionTime":"2025-09-30T06:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.224744 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.224818 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:19:58 crc kubenswrapper[4691]: E0930 06:19:58.224981 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:19:58 crc kubenswrapper[4691]: E0930 06:19:58.225097 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.268872 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.268982 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.269037 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.269065 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.269113 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:58Z","lastTransitionTime":"2025-09-30T06:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.372439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.372508 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.372525 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.372553 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.372571 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:58Z","lastTransitionTime":"2025-09-30T06:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.476277 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.476337 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.476355 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.476380 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.476397 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:58Z","lastTransitionTime":"2025-09-30T06:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.579699 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.579756 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.579773 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.579796 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.579814 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:58Z","lastTransitionTime":"2025-09-30T06:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.683340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.683441 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.683460 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.683483 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.683501 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:58Z","lastTransitionTime":"2025-09-30T06:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.786560 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.786705 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.786731 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.786759 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.786794 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:58Z","lastTransitionTime":"2025-09-30T06:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.890325 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.890417 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.890437 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.890466 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.890485 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:58Z","lastTransitionTime":"2025-09-30T06:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.992545 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.992615 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.992637 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.992665 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:58 crc kubenswrapper[4691]: I0930 06:19:58.992686 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:58Z","lastTransitionTime":"2025-09-30T06:19:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.095820 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.095869 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.095940 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.095967 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.095984 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:59Z","lastTransitionTime":"2025-09-30T06:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.198834 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.198944 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.198963 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.198987 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.199005 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:59Z","lastTransitionTime":"2025-09-30T06:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.224967 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.224995 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:19:59 crc kubenswrapper[4691]: E0930 06:19:59.225206 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:19:59 crc kubenswrapper[4691]: E0930 06:19:59.225285 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.302119 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.302210 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.302228 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.302250 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.302267 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:59Z","lastTransitionTime":"2025-09-30T06:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.405853 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.406062 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.406093 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.406118 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.406135 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:59Z","lastTransitionTime":"2025-09-30T06:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.509433 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.509498 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.509515 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.509540 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.509557 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:59Z","lastTransitionTime":"2025-09-30T06:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.613226 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.613293 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.613309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.613334 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.613351 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:59Z","lastTransitionTime":"2025-09-30T06:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.716449 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.716520 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.716543 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.716571 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.716595 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:59Z","lastTransitionTime":"2025-09-30T06:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.819626 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.819674 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.819692 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.819713 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.819730 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:59Z","lastTransitionTime":"2025-09-30T06:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.922575 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.922635 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.922661 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.922688 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:19:59 crc kubenswrapper[4691]: I0930 06:19:59.922708 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:19:59Z","lastTransitionTime":"2025-09-30T06:19:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.025832 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.025915 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.025933 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.025959 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.025979 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:00Z","lastTransitionTime":"2025-09-30T06:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.129010 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.129065 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.129085 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.129111 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.129132 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:00Z","lastTransitionTime":"2025-09-30T06:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.224711 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:00 crc kubenswrapper[4691]: E0930 06:20:00.224980 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.224720 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:00 crc kubenswrapper[4691]: E0930 06:20:00.225789 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.232854 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.233096 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.233232 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.233397 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.233805 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:00Z","lastTransitionTime":"2025-09-30T06:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.336548 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.336606 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.336661 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.336686 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.336704 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:00Z","lastTransitionTime":"2025-09-30T06:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.439822 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.440194 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.440605 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.440971 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.441291 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:00Z","lastTransitionTime":"2025-09-30T06:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.543876 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.544141 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.544183 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.544210 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.544230 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:00Z","lastTransitionTime":"2025-09-30T06:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.647274 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.647667 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.647798 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.647963 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.648099 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:00Z","lastTransitionTime":"2025-09-30T06:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.751341 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.751387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.751398 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.751416 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.751431 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:00Z","lastTransitionTime":"2025-09-30T06:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.854040 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.854096 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.854114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.854137 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.854155 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:00Z","lastTransitionTime":"2025-09-30T06:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.912876 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:00 crc kubenswrapper[4691]: E0930 06:20:00.913189 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:20:00 crc kubenswrapper[4691]: E0930 06:20:00.913317 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs podName:a8ed6f92-0b98-4b1b-a46e-4d0604d686a1 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:08.913277437 +0000 UTC m=+52.388298577 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs") pod "network-metrics-daemon-svjxq" (UID: "a8ed6f92-0b98-4b1b-a46e-4d0604d686a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.957636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.957701 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.957722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.957765 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:00 crc kubenswrapper[4691]: I0930 06:20:00.957790 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:00Z","lastTransitionTime":"2025-09-30T06:20:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.061090 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.061163 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.061190 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.061219 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.061254 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:01Z","lastTransitionTime":"2025-09-30T06:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.164373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.164446 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.164463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.164492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.164512 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:01Z","lastTransitionTime":"2025-09-30T06:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.224533 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.224533 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:01 crc kubenswrapper[4691]: E0930 06:20:01.225069 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:01 crc kubenswrapper[4691]: E0930 06:20:01.225236 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.267824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.268250 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.268445 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.268593 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.268730 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:01Z","lastTransitionTime":"2025-09-30T06:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.372040 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.372105 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.372148 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.372221 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.372245 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:01Z","lastTransitionTime":"2025-09-30T06:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.479638 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.479708 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.479734 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.479765 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.479788 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:01Z","lastTransitionTime":"2025-09-30T06:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.581754 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.581830 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.581861 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.581921 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.581947 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:01Z","lastTransitionTime":"2025-09-30T06:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.685295 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.685356 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.685376 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.685399 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.685420 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:01Z","lastTransitionTime":"2025-09-30T06:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.787615 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.787669 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.787686 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.787707 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.787725 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:01Z","lastTransitionTime":"2025-09-30T06:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.890987 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.891036 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.891053 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.891074 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.891090 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:01Z","lastTransitionTime":"2025-09-30T06:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.993923 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.993996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.994019 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.994051 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:01 crc kubenswrapper[4691]: I0930 06:20:01.994073 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:01Z","lastTransitionTime":"2025-09-30T06:20:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.097173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.097528 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.097686 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.097932 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.098191 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:02Z","lastTransitionTime":"2025-09-30T06:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.201576 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.201641 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.201658 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.201683 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.201708 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:02Z","lastTransitionTime":"2025-09-30T06:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.223840 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.223841 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:02 crc kubenswrapper[4691]: E0930 06:20:02.224184 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:02 crc kubenswrapper[4691]: E0930 06:20:02.224060 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.304313 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.304364 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.304386 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.304409 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.304425 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:02Z","lastTransitionTime":"2025-09-30T06:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.408765 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.408831 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.408848 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.408872 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.408926 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:02Z","lastTransitionTime":"2025-09-30T06:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.512024 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.512064 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.512078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.512097 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.512107 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:02Z","lastTransitionTime":"2025-09-30T06:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.630385 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.630433 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.630446 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.630473 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.630485 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:02Z","lastTransitionTime":"2025-09-30T06:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.732841 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.732934 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.732951 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.732978 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.732996 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:02Z","lastTransitionTime":"2025-09-30T06:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.836272 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.836317 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.836330 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.836350 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.836368 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:02Z","lastTransitionTime":"2025-09-30T06:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.940113 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.940169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.940186 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.940212 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:02 crc kubenswrapper[4691]: I0930 06:20:02.940230 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:02Z","lastTransitionTime":"2025-09-30T06:20:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.043223 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.043291 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.043309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.043336 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.043355 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.146279 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.146547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.146721 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.146878 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.147074 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.224353 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:03 crc kubenswrapper[4691]: E0930 06:20:03.224578 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.224689 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:03 crc kubenswrapper[4691]: E0930 06:20:03.225028 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.257350 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.257415 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.257433 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.257457 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.257475 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.360651 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.360712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.360736 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.360765 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.360787 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.464515 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.464590 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.464613 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.464642 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.464663 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.568435 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.568486 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.568504 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.568527 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.568543 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.672458 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.672518 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.672536 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.672558 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.672574 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.775323 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.775373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.775389 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.775410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.775426 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.879129 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.879268 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.879296 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.879325 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.879346 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.904766 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.904821 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.904838 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.904932 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.904979 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: E0930 06:20:03.925371 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:03Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.930529 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.930587 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.930603 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.930628 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.930645 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: E0930 06:20:03.950212 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:03Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.954774 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.954822 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.954844 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.954864 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.954880 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: E0930 06:20:03.973962 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list identical to the previous status patch attempt above, elided... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:03Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.979194 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.979261 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.979287 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.979316 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:03 crc kubenswrapper[4691]: I0930 06:20:03.979338 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:03Z","lastTransitionTime":"2025-09-30T06:20:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:03 crc kubenswrapper[4691]: E0930 06:20:03.999553 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list identical to the previous status patch attempt above, elided... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:03Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.005339 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.005388 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.005405 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.005426 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.005443 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:04 crc kubenswrapper[4691]: E0930 06:20:04.026739 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:04Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:04Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:04Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:04Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[ ...image list identical to the previous status patch attempts above, elided... ],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:04Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:04 crc kubenswrapper[4691]: E0930 06:20:04.027024 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.029255 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.029316 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.029341 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.029368 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.029392 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.133065 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.133182 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.133203 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.133228 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.133244 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.224216 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.224305 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:04 crc kubenswrapper[4691]: E0930 06:20:04.224408 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:04 crc kubenswrapper[4691]: E0930 06:20:04.224530 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.237062 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.237119 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.237141 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.237172 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.237194 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.340239 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.340312 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.340333 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.340361 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.340383 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.443268 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.443319 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.443335 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.443359 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.443374 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.546270 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.546426 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.546454 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.546484 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.546505 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.650079 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.650143 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.650162 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.650187 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.650205 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.753503 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.753578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.753596 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.753621 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.753639 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.856801 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.857136 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.857163 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.857198 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.857245 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.960621 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.960700 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.960724 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.960757 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:04 crc kubenswrapper[4691]: I0930 06:20:04.960782 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:04Z","lastTransitionTime":"2025-09-30T06:20:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.063877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.063953 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.063968 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.063987 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.064003 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:05Z","lastTransitionTime":"2025-09-30T06:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.168413 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.168466 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.168482 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.168506 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.168522 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:05Z","lastTransitionTime":"2025-09-30T06:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.224583 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.224618 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:05 crc kubenswrapper[4691]: E0930 06:20:05.225316 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:05 crc kubenswrapper[4691]: E0930 06:20:05.226285 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.234344 4691 scope.go:117] "RemoveContainer" containerID="62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.254937 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.272232 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.272295 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.272316 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.272348 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.272371 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:05Z","lastTransitionTime":"2025-09-30T06:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.273974 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.295996 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.324315 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.342923 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.357291 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.373839 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.378064 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.378235 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.378408 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.378548 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.378761 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:05Z","lastTransitionTime":"2025-09-30T06:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.389961 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.404434 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.421155 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release
\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.453310 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473
f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:50Z\\\",\\\"message\\\":\\\"365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 06:19:50.520239 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0930 06:19:50.520245 6155 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 06:19:50.520244 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 06:19:50.520170 6155 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:19:50.520261 6155 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 06:19:50.520262 6155 obj_retry.go:303] Retry object setup: *v1.Pod 
open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiv
eReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.470908 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 
06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.482458 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.482516 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.482535 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.482560 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.482579 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:05Z","lastTransitionTime":"2025-09-30T06:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.483002 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.497424 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.515618 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.532652 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.544418 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.550705 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/1.log" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.555542 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260"} 
Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.555773 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.582215 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.585956 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.586018 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.586037 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.586062 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.586079 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:05Z","lastTransitionTime":"2025-09-30T06:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.605070 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.634580 
4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154
edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:50Z\\\",\\\"message\\\":\\\"365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 06:19:50.520239 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0930 06:19:50.520245 6155 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 06:19:50.520244 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 06:19:50.520170 6155 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:19:50.520261 6155 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 06:19:50.520262 6155 obj_retry.go:303] Retry object setup: *v1.Pod 
open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.649656 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.673046 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.689447 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.689496 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.689510 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.689533 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.689547 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:05Z","lastTransitionTime":"2025-09-30T06:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.693568 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.714229 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.728977 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.750905 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 
06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.766140 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.789740 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.792975 4691 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.793020 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.793032 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.793048 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.793059 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:05Z","lastTransitionTime":"2025-09-30T06:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.826564 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.840435 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.852498 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.869689 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.885132 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.895813 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.895863 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.895876 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.895922 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.895942 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:05Z","lastTransitionTime":"2025-09-30T06:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.899848 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:05Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.999221 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.999286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.999305 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:05 crc kubenswrapper[4691]: I0930 06:20:05.999331 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:05.999350 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:05Z","lastTransitionTime":"2025-09-30T06:20:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.104264 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.104330 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.104347 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.104374 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.104397 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:06Z","lastTransitionTime":"2025-09-30T06:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.207453 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.207524 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.207542 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.207567 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.207586 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:06Z","lastTransitionTime":"2025-09-30T06:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.224880 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.224866 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:06 crc kubenswrapper[4691]: E0930 06:20:06.225094 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:06 crc kubenswrapper[4691]: E0930 06:20:06.225300 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.310627 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.310742 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.310755 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.310777 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.310788 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:06Z","lastTransitionTime":"2025-09-30T06:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.413925 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.414004 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.414030 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.414062 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.414086 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:06Z","lastTransitionTime":"2025-09-30T06:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.517689 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.517791 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.517809 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.517934 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.517957 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:06Z","lastTransitionTime":"2025-09-30T06:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.563732 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/2.log" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.565139 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/1.log" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.570837 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260" exitCode=1 Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.570942 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.571045 4691 scope.go:117] "RemoveContainer" containerID="62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.572405 4691 scope.go:117] "RemoveContainer" containerID="e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260" Sep 30 06:20:06 crc kubenswrapper[4691]: E0930 06:20:06.572743 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.603017 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.654739 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:50Z\\\",\\\"message\\\":\\\"365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 06:19:50.520239 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0930 06:19:50.520245 6155 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 06:19:50.520244 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 06:19:50.520170 6155 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:19:50.520261 6155 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 06:19:50.520262 6155 obj_retry.go:303] Retry object setup: *v1.Pod open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:06Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206257 6355 
handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 06:20:06.206290 6355 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:20:06.206550 6355 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206243 6355 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:20:06.206758 6355 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207239 6355 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207377 6355 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207466 6355 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 06:20:06.208066 6355 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d
773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.656672 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.656731 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.656751 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.656777 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.656794 4691 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:06Z","lastTransitionTime":"2025-09-30T06:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.681939 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.701201 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.722826 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.739343 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.757459 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 
06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.759945 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.759992 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.760012 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.760044 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.760066 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:06Z","lastTransitionTime":"2025-09-30T06:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.776047 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.795772 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.818072 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.837997 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.858969 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.863522 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.863615 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.863637 4691 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.863671 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.863702 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:06Z","lastTransitionTime":"2025-09-30T06:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.878056 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.895236 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.914415 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.947853 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.967246 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.967313 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.967330 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.967359 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.967376 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:06Z","lastTransitionTime":"2025-09-30T06:20:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:06 crc kubenswrapper[4691]: I0930 06:20:06.971257 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:06Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.070684 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.070782 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.070801 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.070828 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.070846 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:07Z","lastTransitionTime":"2025-09-30T06:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.173585 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.173660 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.173678 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.173706 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.173729 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:07Z","lastTransitionTime":"2025-09-30T06:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.224117 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.224186 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:07 crc kubenswrapper[4691]: E0930 06:20:07.224330 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:07 crc kubenswrapper[4691]: E0930 06:20:07.224766 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.259920 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb
36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.276351 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.276575 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.276704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.276855 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.277037 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:07Z","lastTransitionTime":"2025-09-30T06:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.285442 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.308225 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.330145 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.349435 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.365602 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.379994 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.380070 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.380093 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.380121 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.380142 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:07Z","lastTransitionTime":"2025-09-30T06:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.386302 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.406838 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"sta
rtedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.430149 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.457626 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.482858 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.482943 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:07 crc 
kubenswrapper[4691]: I0930 06:20:07.482963 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.482989 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.483006 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:07Z","lastTransitionTime":"2025-09-30T06:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.492740 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f
837e82cdabf3ea0b92504260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://62876e9f7bbf8224530d8666058b979352841ae2e2236cdf0265027e50fa60a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:19:50Z\\\",\\\"message\\\":\\\"365] Adding new object: *v1.Pod openshift-network-diagnostics/network-check-target-xd92c\\\\nI0930 06:19:50.520239 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g\\\\nI0930 06:19:50.520245 6155 ovn.go:134] Ensuring zone local for Pod openshift-network-diagnostics/network-check-target-xd92c in node crc\\\\nI0930 06:19:50.520244 6155 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0930 06:19:50.520170 6155 model_client.go:382] Update operations generated as: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-cluster-version/cluster-version-operator]} name:Service_openshift-cluster-version/cluster-version-operator_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.4.182:9099:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {61d39e4d-21a9-4387-9a2b-fa4ad14792e2}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:19:50.520261 6155 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0930 06:19:50.520262 6155 obj_retry.go:303] Retry object setup: *v1.Pod open\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:06Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206257 6355 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 06:20:06.206290 6355 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:20:06.206550 6355 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206243 6355 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:20:06.206758 6355 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207239 6355 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207377 6355 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207466 6355 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 06:20:06.208066 6355 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\
\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.514963 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.535931 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.557334 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.574091 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.577335 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/2.log" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.585307 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.585405 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.585429 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.585489 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.585516 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:07Z","lastTransitionTime":"2025-09-30T06:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.590635 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.600720 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:07Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.688608 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.688704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.688759 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.688785 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.688803 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:07Z","lastTransitionTime":"2025-09-30T06:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.792739 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.793096 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.793283 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.793417 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.793563 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:07Z","lastTransitionTime":"2025-09-30T06:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.897866 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.898286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.898429 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.898562 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:07 crc kubenswrapper[4691]: I0930 06:20:07.898704 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:07Z","lastTransitionTime":"2025-09-30T06:20:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.001672 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.001738 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.001754 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.001780 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.001829 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:08Z","lastTransitionTime":"2025-09-30T06:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.105494 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.105564 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.105584 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.105606 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.105623 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:08Z","lastTransitionTime":"2025-09-30T06:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.209016 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.209100 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.209113 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.209130 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.209143 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:08Z","lastTransitionTime":"2025-09-30T06:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.224638 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.224659 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:08 crc kubenswrapper[4691]: E0930 06:20:08.224812 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:08 crc kubenswrapper[4691]: E0930 06:20:08.225000 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.312276 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.312331 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.312349 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.312373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.312390 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:08Z","lastTransitionTime":"2025-09-30T06:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.415609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.415666 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.415682 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.415706 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.415723 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:08Z","lastTransitionTime":"2025-09-30T06:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.518494 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.518558 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.518574 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.518597 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.518614 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:08Z","lastTransitionTime":"2025-09-30T06:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.587228 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.588963 4691 scope.go:117] "RemoveContainer" containerID="e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260" Sep 30 06:20:08 crc kubenswrapper[4691]: E0930 06:20:08.589314 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.612017 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.622005 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.622057 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.622078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.622105 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.622122 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:08Z","lastTransitionTime":"2025-09-30T06:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.639256 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.674573 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:06Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206257 6355 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 06:20:06.206290 6355 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:20:06.206550 6355 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206243 6355 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:20:06.206758 6355 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207239 6355 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207377 6355 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207466 6355 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 06:20:06.208066 6355 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.691607 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\
"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.709872 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.725483 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.725540 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.725557 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.725580 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.725596 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:08Z","lastTransitionTime":"2025-09-30T06:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.728638 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.748300 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.766635 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.788137 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.808852 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 
2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.829534 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.829723 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.829982 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.830011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.830040 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.830063 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:08Z","lastTransitionTime":"2025-09-30T06:20:08Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.846295 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.864732 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.898330 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.923697 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.933957 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.934217 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.934367 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.934507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.934647 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:08Z","lastTransitionTime":"2025-09-30T06:20:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.945106 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:08 crc kubenswrapper[4691]: I0930 06:20:08.966859 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:08Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.011423 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.011587 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.011617 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:20:41.011578319 +0000 UTC m=+84.486599369 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.011743 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.011771 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.011818 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:41.011795576 +0000 UTC m=+84.486816646 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.011848 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.011874 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.011949 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:41.011940081 +0000 UTC m=+84.486961241 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.011976 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.012023 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs podName:a8ed6f92-0b98-4b1b-a46e-4d0604d686a1 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:25.012007903 +0000 UTC m=+68.487028973 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs") pod "network-metrics-daemon-svjxq" (UID: "a8ed6f92-0b98-4b1b-a46e-4d0604d686a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.038378 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.038427 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.038437 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.038455 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.038469 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:09Z","lastTransitionTime":"2025-09-30T06:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.113140 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.113233 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.113361 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.113380 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.113392 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.113443 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:41.113425769 +0000 UTC m=+84.588446819 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.113478 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.113522 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.113541 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.113618 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:41.113592654 +0000 UTC m=+84.588613734 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.140597 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.140627 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.140664 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.140677 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.140687 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:09Z","lastTransitionTime":"2025-09-30T06:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.224208 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.224212 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.224316 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:09 crc kubenswrapper[4691]: E0930 06:20:09.224418 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.244006 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.244050 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.244067 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.244089 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.244106 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:09Z","lastTransitionTime":"2025-09-30T06:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.346479 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.346565 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.346589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.346625 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.346652 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:09Z","lastTransitionTime":"2025-09-30T06:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.450309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.450371 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.450388 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.450413 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.450430 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:09Z","lastTransitionTime":"2025-09-30T06:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.553940 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.554000 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.554016 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.554040 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.554059 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:09Z","lastTransitionTime":"2025-09-30T06:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.657029 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.657114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.657138 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.657168 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.657189 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:09Z","lastTransitionTime":"2025-09-30T06:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.759942 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.760004 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.760023 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.760046 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.760063 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:09Z","lastTransitionTime":"2025-09-30T06:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.862973 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.863040 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.863058 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.863087 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.863106 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:09Z","lastTransitionTime":"2025-09-30T06:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.967061 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.967114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.967126 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.967145 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:09 crc kubenswrapper[4691]: I0930 06:20:09.967157 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:09Z","lastTransitionTime":"2025-09-30T06:20:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.070015 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.070094 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.070120 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.070152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.070177 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:10Z","lastTransitionTime":"2025-09-30T06:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.173726 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.173818 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.173844 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.173877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.174038 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:10Z","lastTransitionTime":"2025-09-30T06:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.224458 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.224482 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:10 crc kubenswrapper[4691]: E0930 06:20:10.224674 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:10 crc kubenswrapper[4691]: E0930 06:20:10.225241 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.277467 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.277524 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.277541 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.277568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.277585 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:10Z","lastTransitionTime":"2025-09-30T06:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.380810 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.380873 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.381005 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.381038 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.381061 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:10Z","lastTransitionTime":"2025-09-30T06:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.484540 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.484615 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.484639 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.484669 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.484690 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:10Z","lastTransitionTime":"2025-09-30T06:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.588114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.588194 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.588217 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.588247 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.588266 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:10Z","lastTransitionTime":"2025-09-30T06:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.691018 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.691085 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.691110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.691139 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.691161 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:10Z","lastTransitionTime":"2025-09-30T06:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.794136 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.794218 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.794238 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.794263 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.794281 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:10Z","lastTransitionTime":"2025-09-30T06:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.897257 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.897330 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.897356 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.897389 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:10 crc kubenswrapper[4691]: I0930 06:20:10.897412 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:10Z","lastTransitionTime":"2025-09-30T06:20:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.000366 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.000420 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.000436 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.000459 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.000476 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:11Z","lastTransitionTime":"2025-09-30T06:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.103617 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.103696 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.103717 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.103788 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.103814 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:11Z","lastTransitionTime":"2025-09-30T06:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.207435 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.207514 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.207568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.207596 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.207617 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:11Z","lastTransitionTime":"2025-09-30T06:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.224023 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.224176 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:11 crc kubenswrapper[4691]: E0930 06:20:11.224428 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:11 crc kubenswrapper[4691]: E0930 06:20:11.224655 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.311329 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.311396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.311416 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.311442 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.311462 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:11Z","lastTransitionTime":"2025-09-30T06:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.414913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.414970 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.414987 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.415011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.415030 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:11Z","lastTransitionTime":"2025-09-30T06:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.517246 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.517311 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.517338 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.517367 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.517390 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:11Z","lastTransitionTime":"2025-09-30T06:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.620434 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.620602 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.620623 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.620655 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.620672 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:11Z","lastTransitionTime":"2025-09-30T06:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.723828 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.723880 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.723933 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.723956 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.723973 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:11Z","lastTransitionTime":"2025-09-30T06:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.826568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.826620 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.826637 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.826660 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.826676 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:11Z","lastTransitionTime":"2025-09-30T06:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.929550 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.929593 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.929603 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.929620 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:11 crc kubenswrapper[4691]: I0930 06:20:11.929634 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:11Z","lastTransitionTime":"2025-09-30T06:20:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.032595 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.032636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.032645 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.032660 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.032669 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:12Z","lastTransitionTime":"2025-09-30T06:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.136190 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.136248 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.136266 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.136291 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.136309 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:12Z","lastTransitionTime":"2025-09-30T06:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.223932 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.223932 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:12 crc kubenswrapper[4691]: E0930 06:20:12.224123 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:12 crc kubenswrapper[4691]: E0930 06:20:12.224226 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.238761 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.238791 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.238798 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.238809 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.238819 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:12Z","lastTransitionTime":"2025-09-30T06:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.342264 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.342319 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.342338 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.342364 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.342384 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:12Z","lastTransitionTime":"2025-09-30T06:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.444960 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.445014 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.445030 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.445051 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.445068 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:12Z","lastTransitionTime":"2025-09-30T06:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.548182 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.548227 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.548240 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.548257 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.548269 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:12Z","lastTransitionTime":"2025-09-30T06:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.650879 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.650974 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.650996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.651021 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.651065 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:12Z","lastTransitionTime":"2025-09-30T06:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.753684 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.753741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.753762 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.753787 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.753805 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:12Z","lastTransitionTime":"2025-09-30T06:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.856426 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.856486 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.856506 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.856528 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.856545 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:12Z","lastTransitionTime":"2025-09-30T06:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.960054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.960122 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.960141 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.960165 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:12 crc kubenswrapper[4691]: I0930 06:20:12.960183 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:12Z","lastTransitionTime":"2025-09-30T06:20:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.063181 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.063248 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.063269 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.063292 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.063312 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:13Z","lastTransitionTime":"2025-09-30T06:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.166471 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.166542 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.166560 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.166586 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.166604 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:13Z","lastTransitionTime":"2025-09-30T06:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.224110 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.224160 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:13 crc kubenswrapper[4691]: E0930 06:20:13.224372 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:13 crc kubenswrapper[4691]: E0930 06:20:13.224514 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.270500 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.270556 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.270572 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.270596 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.270615 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:13Z","lastTransitionTime":"2025-09-30T06:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.365295 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.373626 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.373682 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.373705 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.373735 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.373756 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:13Z","lastTransitionTime":"2025-09-30T06:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.379554 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.389636 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.425057 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/op
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.425057 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z"
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.467194 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.467194 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z"
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.476837 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.476915 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.476934 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.476985 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.477001 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:13Z","lastTransitionTime":"2025-09-30T06:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.490303 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z"
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.528652 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.547989 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.570954 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
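Every failed patch in this stretch names the same endpoint, https://127.0.0.1:9743/pod. A diagnostic Go sketch that dials it from the node and prints the peer certificate's validity window; InsecureSkipVerify is deliberate here, since verification is skipped only so the handshake completes and the expired window can be read:

package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Endpoint taken from the log entries above.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{InsecureSkipVerify: true})
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if time.Now().After(cert.NotAfter) {
		fmt.Println("expired; this is the x509 error the status_manager entries report")
	}
}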
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.570954 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z"
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.579592 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.579648 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.579665 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.579694 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.579712 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:13Z","lastTransitionTime":"2025-09-30T06:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.594855 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cr
i-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.630360 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f
837e82cdabf3ea0b92504260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:06Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206257 6355 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 06:20:06.206290 6355 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:20:06.206550 6355 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206243 6355 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:20:06.206758 6355 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207239 6355 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207377 6355 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207466 6355 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 06:20:06.208066 6355 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.650437 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.671133 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.683071 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.683137 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.683158 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.683188 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.683206 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:13Z","lastTransitionTime":"2025-09-30T06:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.692509 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubel
et\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.708602 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.725179 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.742057 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:13Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.785878 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.785960 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.785980 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.786001 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.786019 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:13Z","lastTransitionTime":"2025-09-30T06:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.888578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.888628 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.888645 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.888667 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.888685 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:13Z","lastTransitionTime":"2025-09-30T06:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.991723 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.991788 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.991804 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.991828 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:13 crc kubenswrapper[4691]: I0930 06:20:13.991844 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:13Z","lastTransitionTime":"2025-09-30T06:20:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.094723 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.094794 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.094814 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.094839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.094857 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.199179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.199263 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.199288 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.199322 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.199347 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.224593 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.224600 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:14 crc kubenswrapper[4691]: E0930 06:20:14.224835 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:14 crc kubenswrapper[4691]: E0930 06:20:14.225029 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.302397 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.302456 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.302472 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.302496 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.302568 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.374515 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.374585 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.374601 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.374628 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.374646 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: E0930 06:20:14.396847 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:14Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.401979 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.402040 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.402059 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.402084 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.402101 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: E0930 06:20:14.421144 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[…],\\\"nodeInfo\\\":{…}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:14Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.426541 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.426615 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc"
event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.426632 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.426657 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.426677 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: E0930 06:20:14.445969 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:14Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.451016 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.451071 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.451089 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.451114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.451133 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: E0930 06:20:14.470034 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:14Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.475270 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.475358 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.475402 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.475429 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.475448 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: E0930 06:20:14.495599 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:14Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:14 crc kubenswrapper[4691]: E0930 06:20:14.495813 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.498096 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.498148 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.498163 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.498185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.498230 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.601010 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.601067 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.601084 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.601106 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.601122 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.704378 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.704432 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.704448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.704469 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.704485 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.807607 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.807677 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.807722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.807748 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.807770 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.910974 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.911037 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.911065 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.911095 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:14 crc kubenswrapper[4691]: I0930 06:20:14.911115 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:14Z","lastTransitionTime":"2025-09-30T06:20:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.014169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.014236 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.014260 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.014287 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.014308 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.117622 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.117690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.117709 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.117737 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.117754 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.221671 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.221747 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.221769 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.221799 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.221820 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.224049 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:15 crc kubenswrapper[4691]: E0930 06:20:15.224220 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.224059 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:15 crc kubenswrapper[4691]: E0930 06:20:15.224702 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.324985 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.325064 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.325087 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.325117 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.325141 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.428783 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.428847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.428863 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.428913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.428934 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
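
While the network plugin is down, setters.go re-records the same Ready=False condition roughly every 100 ms, so the payload after condition= is the one structure worth reading in this stretch. Below is a small Go sketch for pulling that payload out of a captured line; the struct mirrors only the fields visible in the log and is an illustration, not the kubelet's own type.

package main

import (
	"encoding/json"
	"fmt"
)

// nodeCondition mirrors the fields of the condition={...} payload logged above.
type nodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Condition payload copied verbatim from one of the setters.go entries above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`

	var c nodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		fmt.Println("unmarshal failed:", err)
		return
	}
	fmt.Printf("type=%s status=%s reason=%s\nmessage=%s\n", c.Type, c.Status, c.Reason, c.Message)
}
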
Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.531047 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.531089 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.531101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.531116 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.531129 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.633320 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.633376 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.633393 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.633417 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.633435 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.736487 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.736542 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.736559 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.736584 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.736603 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.840025 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.840379 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.840396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.840418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.840434 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.943258 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.943322 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.943340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.943366 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:15 crc kubenswrapper[4691]: I0930 06:20:15.943384 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:15Z","lastTransitionTime":"2025-09-30T06:20:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.046432 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.046523 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.046543 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.046564 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.046581 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:16Z","lastTransitionTime":"2025-09-30T06:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.149721 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.149792 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.149816 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.149842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.149863 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:16Z","lastTransitionTime":"2025-09-30T06:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.224419 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.224419 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:16 crc kubenswrapper[4691]: E0930 06:20:16.224620 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:16 crc kubenswrapper[4691]: E0930 06:20:16.224771 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1"
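
The NotReady condition and the pod-sync errors above all reduce to one missing input: there is no CNI configuration under /etc/kubernetes/cni/net.d/, because the ovnkube-node pod that would write it is not running cleanly. Here is a minimal Go sketch of that readiness test, which conceptually mirrors the container runtime's scan of the conf dir named in the log for .conf, .conflist, or .json files; it is illustrative only, not the CRI-O implementation.

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// Directory taken from the "no CNI configuration file" message above.
	const confDir = "/etc/kubernetes/cni/net.d"

	entries, err := os.ReadDir(confDir)
	if err != nil {
		fmt.Println("cannot read CNI conf dir:", err)
		return
	}
	found := 0
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			fmt.Println("CNI config:", filepath.Join(confDir, e.Name()))
			found++
		}
	}
	if found == 0 {
		// Matches the kubelet condition: NetworkReady=false / NetworkPluginNotReady.
		fmt.Println("no CNI configuration file found; network plugin not ready")
	}
}

Once the OVN-Kubernetes controller stops crash-looping and drops its configuration into this directory, the runtime reports NetworkReady=true and the repeated NodeNotReady events stop.
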
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.252422 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.252491 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.252513 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.252542 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.252565 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:16Z","lastTransitionTime":"2025-09-30T06:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.355753 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.355812 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.355831 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.355855 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.355872 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:16Z","lastTransitionTime":"2025-09-30T06:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.458405 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.458469 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.458495 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.458522 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.458546 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:16Z","lastTransitionTime":"2025-09-30T06:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.561940 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.562014 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.562032 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.562061 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.562084 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:16Z","lastTransitionTime":"2025-09-30T06:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.665107 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.665203 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.665224 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.665279 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.665299 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:16Z","lastTransitionTime":"2025-09-30T06:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.771150 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.771222 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.771239 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.771265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.771282 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:16Z","lastTransitionTime":"2025-09-30T06:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.874477 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.874545 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.874564 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.874591 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.874608 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:16Z","lastTransitionTime":"2025-09-30T06:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.978194 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.978260 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.978275 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.978294 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:16 crc kubenswrapper[4691]: I0930 06:20:16.978313 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:16Z","lastTransitionTime":"2025-09-30T06:20:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.081726 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.081823 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.081842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.081867 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.081910 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:17Z","lastTransitionTime":"2025-09-30T06:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.185923 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.185977 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.185995 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.186018 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.186035 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:17Z","lastTransitionTime":"2025-09-30T06:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.224782 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:17 crc kubenswrapper[4691]: E0930 06:20:17.224989 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.225294 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:17 crc kubenswrapper[4691]: E0930 06:20:17.225407 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.248394 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.272200 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.289352 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.289418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:17 crc 
kubenswrapper[4691]: I0930 06:20:17.289439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.289467 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.289485 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:17Z","lastTransitionTime":"2025-09-30T06:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.306789 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f
837e82cdabf3ea0b92504260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:06Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206257 6355 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 06:20:06.206290 6355 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:20:06.206550 6355 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206243 6355 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:20:06.206758 6355 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207239 6355 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207377 6355 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207466 6355 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 06:20:06.208066 6355 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.331222 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.354184 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.375833 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.392138 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.392641 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.392695 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.392709 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:17 crc 
kubenswrapper[4691]: I0930 06:20:17.392735 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.392751 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:17Z","lastTransitionTime":"2025-09-30T06:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.411134 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env
-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.429605 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.444211 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.476544 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.489606 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.495261 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.495297 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.495333 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.495349 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.495361 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:17Z","lastTransitionTime":"2025-09-30T06:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.502261 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.514009 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.527683 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.542012 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.556422 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.575903 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c655fbd5-0708-4151-b0fb-a97d8e1826bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:17Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.597616 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.597648 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.597662 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.597679 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 
06:20:17.597691 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:17Z","lastTransitionTime":"2025-09-30T06:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.701166 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.701232 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.701253 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.701283 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.701302 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:17Z","lastTransitionTime":"2025-09-30T06:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.804732 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.805409 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.805678 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.805951 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.806130 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:17Z","lastTransitionTime":"2025-09-30T06:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
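
The status-patch failures above all share one root cause, spelled out in the error text: the serving certificate of the pod.network-node-identity.openshift.io webhook at https://127.0.0.1:9743 expired on 2025-08-24T17:21:41Z, so every TLS handshake is rejected before the POST /pod call is attempted. A minimal sketch that reproduces the same validity check from outside the kubelet (Python stdlib only; the host and port are copied from the log line, everything else is illustrative):

    import socket
    import ssl

    def check_webhook_cert(host: str = "127.0.0.1", port: int = 9743) -> None:
        # create_default_context() enforces chain building and validity dates,
        # the same class of check that produced the x509 error in the log.
        ctx = ssl.create_default_context()
        try:
            with socket.create_connection((host, port), timeout=10) as sock:
                with ctx.wrap_socket(sock, server_hostname=host):
                    print("handshake OK: certificate is currently valid")
        except ssl.SSLCertVerificationError as exc:
            # An expired certificate reports "certificate has expired" here;
            # on a host without the cluster CA in its trust store the failure
            # would be an untrusted-issuer error instead.
            print(f"handshake rejected: {exc.verify_message}")

    if __name__ == "__main__":
        check_webhook_cert()
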
Has your network provider started?"} Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.909474 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.909535 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.909552 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.909577 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:17 crc kubenswrapper[4691]: I0930 06:20:17.909595 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:17Z","lastTransitionTime":"2025-09-30T06:20:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.012309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.012364 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.012383 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.012410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.012431 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:18Z","lastTransitionTime":"2025-09-30T06:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.115873 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.116263 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.116399 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.116534 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.116682 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:18Z","lastTransitionTime":"2025-09-30T06:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.219507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.219559 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.219578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.219601 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.219617 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:18Z","lastTransitionTime":"2025-09-30T06:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.224076 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.224156 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:18 crc kubenswrapper[4691]: E0930 06:20:18.224232 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:18 crc kubenswrapper[4691]: E0930 06:20:18.224294 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
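
For reference, the doubly-escaped err="failed to patch status ..." payloads quoted above are ordinary Kubernetes strategic-merge patches. A sketch of the skeleton of the first one (uid and field values copied from the log; "$setElementOrder/conditions" is the directive fixing the final order of the conditions list, while the parallel "conditions" array carries only the fields being changed):

    import json

    status_patch = {
        "metadata": {"uid": "ef543e1b-8068-4ea3-b32a-61027b32e95d"},
        "status": {
            # Directive: the order status.conditions should end up in.
            "$setElementOrder/conditions": [
                {"type": "PodReadyToStartContainers"},
                {"type": "Initialized"},
                {"type": "Ready"},
                {"type": "ContainersReady"},
                {"type": "PodScheduled"},
            ],
            # Merge payload: per-condition fields being updated.
            "conditions": [
                {"lastTransitionTime": "2025-09-30T06:19:38Z",
                 "type": "PodReadyToStartContainers"},
                {"lastTransitionTime": "2025-09-30T06:19:38Z",
                 "status": "True", "type": "Ready"},
                {"lastTransitionTime": "2025-09-30T06:19:38Z",
                 "type": "ContainersReady"},
            ],
        },
    }
    print(json.dumps(status_patch, indent=2))
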
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.321825 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.322668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.322823 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.323016 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.323168 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:18Z","lastTransitionTime":"2025-09-30T06:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.425778 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.425834 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.425851 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.425877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.425929 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:18Z","lastTransitionTime":"2025-09-30T06:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.528778 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.529277 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.529585 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.529807 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.530058 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:18Z","lastTransitionTime":"2025-09-30T06:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.634039 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.634110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.634132 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.634162 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.634184 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:18Z","lastTransitionTime":"2025-09-30T06:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.736798 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.736928 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.736950 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.736975 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.736993 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:18Z","lastTransitionTime":"2025-09-30T06:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.839951 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.840011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.840029 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.840054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.840071 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:18Z","lastTransitionTime":"2025-09-30T06:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.943631 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.943722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.943747 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.943772 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:18 crc kubenswrapper[4691]: I0930 06:20:18.943789 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:18Z","lastTransitionTime":"2025-09-30T06:20:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.047182 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.047239 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.047276 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.047311 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.047336 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:19Z","lastTransitionTime":"2025-09-30T06:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.150214 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.150278 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.150296 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.150325 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.150344 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:19Z","lastTransitionTime":"2025-09-30T06:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.224099 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.224224 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:19 crc kubenswrapper[4691]: E0930 06:20:19.224289 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:19 crc kubenswrapper[4691]: E0930 06:20:19.224388 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
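
The NodeNotReady heartbeats repeating through this window all carry the same condition: the runtime reports NetworkReady=false because /etc/kubernetes/cni/net.d/ holds no CNI configuration yet. A sketch of the directory check this boils down to (the accepted extensions are the usual libcni ones, assumed here rather than taken from this log):

    from pathlib import Path

    CNI_CONF_DIR = Path("/etc/kubernetes/cni/net.d")  # path from the log message

    def cni_config_present(conf_dir: Path = CNI_CONF_DIR) -> bool:
        # libcni treats .conf, .conflist and .json files as network configs;
        # NetworkReady stays false until at least one valid file appears.
        if not conf_dir.is_dir():
            return False
        return any(p.suffix in {".conf", ".conflist", ".json"}
                   for p in conf_dir.iterdir())

    print("CNI config present:", cni_config_present())
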
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.253128 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.253195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.253217 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.253244 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.253266 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:19Z","lastTransitionTime":"2025-09-30T06:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.355664 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.355723 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.355740 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.355761 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.355779 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:19Z","lastTransitionTime":"2025-09-30T06:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.459034 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.459087 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.459101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.459122 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.459137 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:19Z","lastTransitionTime":"2025-09-30T06:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.562360 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.562444 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.562459 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.562481 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.562496 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:19Z","lastTransitionTime":"2025-09-30T06:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.665196 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.665278 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.665305 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.665332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.665350 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:19Z","lastTransitionTime":"2025-09-30T06:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.772429 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.772547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.772568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.772647 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.772672 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:19Z","lastTransitionTime":"2025-09-30T06:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.876634 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.876679 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.876694 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.876717 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.876733 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:19Z","lastTransitionTime":"2025-09-30T06:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.980019 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.980153 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.980180 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.980213 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:19 crc kubenswrapper[4691]: I0930 06:20:19.980236 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:19Z","lastTransitionTime":"2025-09-30T06:20:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
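
While the node stays not-ready, the same five-entry block (NodeHasSufficientMemory, NodeHasNoDiskPressure, NodeHasSufficientPID, NodeNotReady, and the setters.go condition) repeats roughly every 100 ms. When reading a capture like this, it can help to collapse the repeats first; a throwaway sketch matched against the klog prefix format used by kubenswrapper above (the regex is illustrative, not exhaustive):

    import re
    from collections import Counter

    # Matches lines like:
    #   I0930 06:20:18.012431 4691 setters.go:603] "Node became not ready" ...
    # capturing (second, quoted message) so sub-second repeats collapse.
    ENTRY = re.compile(r'([IEW]\d{4} \d{2}:\d{2}:\d{2})\.\d+ +\d+ \S+\] "([^"]+)"')

    def summarize(journal_text: str) -> Counter:
        return Counter(ENTRY.findall(journal_text))

    sample = 'I0930 06:20:18.012431 4691 setters.go:603] "Node became not ready"'
    print(summarize(sample).most_common(5))
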
Has your network provider started?"} Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.083088 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.083148 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.083169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.083193 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.083209 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:20Z","lastTransitionTime":"2025-09-30T06:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.186461 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.186514 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.186532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.186554 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.186570 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:20Z","lastTransitionTime":"2025-09-30T06:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.224645 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:20 crc kubenswrapper[4691]: E0930 06:20:20.224824 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.225213 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:20 crc kubenswrapper[4691]: E0930 06:20:20.225376 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.227738 4691 scope.go:117] "RemoveContainer" containerID="e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260" Sep 30 06:20:20 crc kubenswrapper[4691]: E0930 06:20:20.228440 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.289620 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.289678 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.289699 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.289727 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.289747 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:20Z","lastTransitionTime":"2025-09-30T06:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.392483 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.392522 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.392539 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.392562 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.392578 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:20Z","lastTransitionTime":"2025-09-30T06:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.495169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.495258 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.495286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.495321 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.495351 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:20Z","lastTransitionTime":"2025-09-30T06:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.597767 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.597803 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.597820 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.597842 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.597858 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:20Z","lastTransitionTime":"2025-09-30T06:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.700434 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.700496 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.700516 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.700541 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.700558 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:20Z","lastTransitionTime":"2025-09-30T06:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.803661 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.803766 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.803789 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.803812 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.803862 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:20Z","lastTransitionTime":"2025-09-30T06:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.907532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.907585 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.907604 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.907626 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:20 crc kubenswrapper[4691]: I0930 06:20:20.907643 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:20Z","lastTransitionTime":"2025-09-30T06:20:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.011249 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.011316 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.011340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.011367 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.011390 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:21Z","lastTransitionTime":"2025-09-30T06:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.113711 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.113738 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.113746 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.113758 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.113768 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:21Z","lastTransitionTime":"2025-09-30T06:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.215872 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.215954 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.215971 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.215994 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.216010 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:21Z","lastTransitionTime":"2025-09-30T06:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.224105 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.224194 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:21 crc kubenswrapper[4691]: E0930 06:20:21.224256 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:21 crc kubenswrapper[4691]: E0930 06:20:21.224376 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.319310 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.319365 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.319380 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.319404 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.319426 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:21Z","lastTransitionTime":"2025-09-30T06:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.422804 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.422877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.422929 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.422957 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.422981 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:21Z","lastTransitionTime":"2025-09-30T06:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.525540 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.525624 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.525635 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.525672 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.525684 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:21Z","lastTransitionTime":"2025-09-30T06:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.628360 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.628690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.628830 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.629007 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.629147 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:21Z","lastTransitionTime":"2025-09-30T06:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.732215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.732556 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.732644 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.732736 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.732827 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:21Z","lastTransitionTime":"2025-09-30T06:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.838364 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.838921 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.839011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.839098 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.839157 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:21Z","lastTransitionTime":"2025-09-30T06:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.941322 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.941472 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.941561 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.941651 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:21 crc kubenswrapper[4691]: I0930 06:20:21.941735 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:21Z","lastTransitionTime":"2025-09-30T06:20:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.044169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.044212 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.044224 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.044241 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.044254 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:22Z","lastTransitionTime":"2025-09-30T06:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.147826 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.147880 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.147935 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.147961 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.147977 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:22Z","lastTransitionTime":"2025-09-30T06:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.224590 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:22 crc kubenswrapper[4691]: E0930 06:20:22.224873 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.225423 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:22 crc kubenswrapper[4691]: E0930 06:20:22.227314 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.250706 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.250760 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.250789 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.250817 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.250842 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:22Z","lastTransitionTime":"2025-09-30T06:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.356969 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.357062 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.357126 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.357156 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.357219 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:22Z","lastTransitionTime":"2025-09-30T06:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.460294 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.460382 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.460455 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.460527 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.460554 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:22Z","lastTransitionTime":"2025-09-30T06:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.564072 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.564249 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.564270 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.564322 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.564338 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:22Z","lastTransitionTime":"2025-09-30T06:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.666455 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.666550 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.666568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.666589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.666605 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:22Z","lastTransitionTime":"2025-09-30T06:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.770008 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.770450 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.770557 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.770653 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.770723 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:22Z","lastTransitionTime":"2025-09-30T06:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.873639 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.874145 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.874216 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.874340 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.874424 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:22Z","lastTransitionTime":"2025-09-30T06:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.977490 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.977566 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.977579 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.977596 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:22 crc kubenswrapper[4691]: I0930 06:20:22.977604 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:22Z","lastTransitionTime":"2025-09-30T06:20:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.079072 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.079104 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.079113 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.079127 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.079137 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:23Z","lastTransitionTime":"2025-09-30T06:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.181684 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.181734 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.181746 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.181762 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.181774 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:23Z","lastTransitionTime":"2025-09-30T06:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.224270 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.224284 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:23 crc kubenswrapper[4691]: E0930 06:20:23.224439 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:23 crc kubenswrapper[4691]: E0930 06:20:23.224509 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.284302 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.284334 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.284341 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.284352 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.284360 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:23Z","lastTransitionTime":"2025-09-30T06:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.386510 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.386558 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.386574 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.386595 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.386610 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:23Z","lastTransitionTime":"2025-09-30T06:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.489591 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.489634 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.489650 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.489666 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.489676 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:23Z","lastTransitionTime":"2025-09-30T06:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.592752 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.592791 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.592801 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.592814 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.592825 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:23Z","lastTransitionTime":"2025-09-30T06:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.694862 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.694983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.695028 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.695042 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.695051 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:23Z","lastTransitionTime":"2025-09-30T06:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.797663 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.797730 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.797755 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.797783 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.797805 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:23Z","lastTransitionTime":"2025-09-30T06:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.900708 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.900781 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.900805 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.900836 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:23 crc kubenswrapper[4691]: I0930 06:20:23.900857 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:23Z","lastTransitionTime":"2025-09-30T06:20:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.003406 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.003514 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.003537 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.003564 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.003588 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.106581 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.106625 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.106636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.106654 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.106666 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.209649 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.209718 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.209741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.209763 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.209780 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.224061 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.224082 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:24 crc kubenswrapper[4691]: E0930 06:20:24.224269 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:24 crc kubenswrapper[4691]: E0930 06:20:24.224319 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.312023 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.312080 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.312097 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.312119 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.312135 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.414926 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.414989 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.415011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.415041 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.415127 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.517749 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.517815 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.517833 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.517858 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.517877 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.620779 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.620855 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.620881 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.620996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.621021 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.651433 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.651493 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.651511 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.651536 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.651555 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.651555 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:24 crc kubenswrapper[4691]: E0930 06:20:24.669608 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:24Z is after 2025-08-24T17:21:41Z"
Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.675186 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.675317 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.675355 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.675397 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: E0930 06:20:24.698346 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:24Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.703646 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.703700 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.703719 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.703743 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.703760 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: E0930 06:20:24.722791 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:24Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.727507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.727579 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
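Every retry in this burst dies at the same place: the node-status PATCH is intercepted by the node.network-node-identity.openshift.io webhook on 127.0.0.1:9743, and the TLS handshake fails because the serving certificate's NotAfter (2025-08-24T17:21:41Z) is already in the past. A minimal Go sketch of that check, fetching the peer certificate from the endpoint named in the log and comparing its validity window against the clock; InsecureSkipVerify is assumed here purely so an already-expired certificate can still be inspected, and this is a diagnostic sketch, not kubelet or webhook code.

// certcheck.go - sketch: reproduce the x509 expiry failure seen in the log
// by fetching the webhook's serving certificate and comparing its validity
// window against the current time.
package main

import (
	"crypto/tls"
	"fmt"
	"log"
	"time"
)

func main() {
	// Address taken from the failing Post in the log.
	conn, err := tls.Dial("tcp", "127.0.0.1:9743", &tls.Config{
		InsecureSkipVerify: true, // we want the cert even though verification fails
	})
	if err != nil {
		log.Fatalf("dial webhook endpoint: %v", err)
	}
	defer conn.Close()

	cert := conn.ConnectionState().PeerCertificates[0]
	now := time.Now().UTC()
	fmt.Printf("subject:   %s\n", cert.Subject)
	fmt.Printf("notBefore: %s\n", cert.NotBefore.Format(time.RFC3339))
	fmt.Printf("notAfter:  %s\n", cert.NotAfter.Format(time.RFC3339))
	if now.After(cert.NotAfter) {
		// This is the condition behind "certificate has expired or is not
		// yet valid: current time ... is after ..." in the entries above.
		fmt.Printf("EXPIRED: current time %s is after %s\n",
			now.Format(time.RFC3339), cert.NotAfter.Format(time.RFC3339))
	}
}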
event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.727599 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.727625 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.727644 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: E0930 06:20:24.747759 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:24Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.751311 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.751367 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
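The shape of these entries is a bounded retry: the same patch is attempted five times within the same second, each attempt logs "Error updating node status, will retry", and the fifth failure is followed just below by "Unable to update node status" with "update node status exceeds retry count". A simplified sketch of that loop under stated assumptions: tryPatchNodeStatus is a hypothetical stand-in for the real API call, and the attempt cap mirrors the upstream kubelet constant nodeStatusUpdateRetry = 5, which matches the five E-lines here; this is not the actual kubelet source.

// retry.go - sketch of the bounded retry visible in the log: a fixed number
// of attempts, then a terminal "exceeds retry count" error.
package main

import (
	"errors"
	"fmt"
)

const nodeStatusUpdateRetry = 5 // matches the five failed attempts above

// tryPatchNodeStatus is a hypothetical stand-in. In the log, every attempt
// dies in the node.network-node-identity.openshift.io webhook with an
// expired certificate, so retrying within the same second can never succeed.
func tryPatchNodeStatus(attempt int) error {
	return errors.New("failed calling webhook: x509: certificate has expired")
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryPatchNodeStatus(i); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println(err) // terminal state logged at kubelet_node_status.go:572
	}
}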
event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.751387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.751408 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.751425 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: E0930 06:20:24.771926 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:24Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:24 crc kubenswrapper[4691]: E0930 06:20:24.772026 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.773549 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.773609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.773620 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.773639 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.773654 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.875779 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.875819 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.875832 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.875849 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.875861 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.978962 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.979010 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.979029 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.979052 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:24 crc kubenswrapper[4691]: I0930 06:20:24.979069 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:24Z","lastTransitionTime":"2025-09-30T06:20:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.082517 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.082546 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.082555 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.082567 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.082577 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:25Z","lastTransitionTime":"2025-09-30T06:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.088037 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:25 crc kubenswrapper[4691]: E0930 06:20:25.088159 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:20:25 crc kubenswrapper[4691]: E0930 06:20:25.088204 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs podName:a8ed6f92-0b98-4b1b-a46e-4d0604d686a1 nodeName:}" failed. No retries permitted until 2025-09-30 06:20:57.088191223 +0000 UTC m=+100.563212263 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs") pod "network-metrics-daemon-svjxq" (UID: "a8ed6f92-0b98-4b1b-a46e-4d0604d686a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.184975 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.185000 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.185008 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.185018 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.185026 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:25Z","lastTransitionTime":"2025-09-30T06:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.223788 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.223830 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:25 crc kubenswrapper[4691]: E0930 06:20:25.224028 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:25 crc kubenswrapper[4691]: E0930 06:20:25.224079 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.286955 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.287020 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.287044 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.287068 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.287086 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:25Z","lastTransitionTime":"2025-09-30T06:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.389526 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.389578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.389590 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.389613 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.389627 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:25Z","lastTransitionTime":"2025-09-30T06:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.491831 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.491949 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.491968 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.491994 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.492011 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:25Z","lastTransitionTime":"2025-09-30T06:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.594769 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.594828 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.594845 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.594870 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.594926 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:25Z","lastTransitionTime":"2025-09-30T06:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.697034 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.697092 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.697109 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.697130 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.697146 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:25Z","lastTransitionTime":"2025-09-30T06:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.800033 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.800077 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.800092 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.800113 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.800130 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:25Z","lastTransitionTime":"2025-09-30T06:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.902874 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.902943 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.902955 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.902974 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:25 crc kubenswrapper[4691]: I0930 06:20:25.902989 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:25Z","lastTransitionTime":"2025-09-30T06:20:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.005831 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.005945 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.005970 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.006000 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.006023 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:26Z","lastTransitionTime":"2025-09-30T06:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.108993 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.109061 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.109086 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.109114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.109137 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:26Z","lastTransitionTime":"2025-09-30T06:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.212006 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.212064 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.212075 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.212093 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.212106 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:26Z","lastTransitionTime":"2025-09-30T06:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.224625 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:26 crc kubenswrapper[4691]: E0930 06:20:26.224753 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.224629 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:26 crc kubenswrapper[4691]: E0930 06:20:26.224957 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.315281 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.315342 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.315353 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.315369 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.315382 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:26Z","lastTransitionTime":"2025-09-30T06:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.418479 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.418532 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.418543 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.418562 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.418574 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:26Z","lastTransitionTime":"2025-09-30T06:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.521209 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.521264 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.521278 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.521298 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.521310 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:26Z","lastTransitionTime":"2025-09-30T06:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.624483 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.624549 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.624567 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.624591 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.624609 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:26Z","lastTransitionTime":"2025-09-30T06:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.653131 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/0.log" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.653211 4691 generic.go:334] "Generic (PLEG): container finished" podID="5bfd073c-4582-4a65-8170-7030f4852174" containerID="3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9" exitCode=1 Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.653257 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjjw8" event={"ID":"5bfd073c-4582-4a65-8170-7030f4852174","Type":"ContainerDied","Data":"3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.653928 4691 scope.go:117] "RemoveContainer" containerID="3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.668119 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\"
:\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.697554 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd
-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.718816 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.729540 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.729598 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.729617 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.729644 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.729662 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:26Z","lastTransitionTime":"2025-09-30T06:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.740338 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.755840 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.772283 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.783657 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.796950 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.810071 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c655fbd5-0708-4151-b0fb-a97d8e1826bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.826337 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.831379 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.831418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.831430 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.831449 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.831460 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:26Z","lastTransitionTime":"2025-09-30T06:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.841103 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.863982 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:06Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206257 6355 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 06:20:06.206290 6355 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:20:06.206550 6355 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206243 6355 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:20:06.206758 6355 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207239 6355 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207377 6355 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207466 6355 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 06:20:06.208066 6355 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.877394 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.890247 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.905671 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"2025-09-30T06:19:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28\\\\n2025-09-30T06:19:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28 to /host/opt/cni/bin/\\\\n2025-09-30T06:19:41Z [verbose] multus-daemon started\\\\n2025-09-30T06:19:41Z [verbose] Readiness Indicator file check\\\\n2025-09-30T06:20:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.916598 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.928582 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 
06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.933769 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.933815 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.933829 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.933849 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.933862 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:26Z","lastTransitionTime":"2025-09-30T06:20:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:26 crc kubenswrapper[4691]: I0930 06:20:26.946109 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:26Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.036631 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.036685 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.036698 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.036716 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.036729 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:27Z","lastTransitionTime":"2025-09-30T06:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.140261 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.140312 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.140323 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.140344 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.140355 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:27Z","lastTransitionTime":"2025-09-30T06:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.224616 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.224775 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:27 crc kubenswrapper[4691]: E0930 06:20:27.224865 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:27 crc kubenswrapper[4691]: E0930 06:20:27.225043 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.242944 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.243180 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.243191 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.243210 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.243223 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:27Z","lastTransitionTime":"2025-09-30T06:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.246147 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.264623 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.279502 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.291500 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.305842 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.337710 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.345124 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.345158 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.345166 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.345182 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.345194 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:27Z","lastTransitionTime":"2025-09-30T06:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.359976 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.382732 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c655fbd5-0708-4151-b0fb-a97d8e1826bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.406922 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.436286 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:06Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206257 6355 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 06:20:06.206290 6355 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:20:06.206550 6355 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206243 6355 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:20:06.206758 6355 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207239 6355 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207377 6355 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207466 6355 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 06:20:06.208066 6355 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.448370 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.448432 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.448450 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.448474 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.448494 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:27Z","lastTransitionTime":"2025-09-30T06:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.462235 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.486569 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.502452 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"2025-09-30T06:19:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28\\\\n2025-09-30T06:19:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28 to /host/opt/cni/bin/\\\\n2025-09-30T06:19:41Z [verbose] multus-daemon started\\\\n2025-09-30T06:19:41Z [verbose] Readiness Indicator file check\\\\n2025-09-30T06:20:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.515824 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.526565 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 
06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.538809 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.551124 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.551168 4691 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.551182 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.551200 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.551212 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:27Z","lastTransitionTime":"2025-09-30T06:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.552816 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.569851 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.653395 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.653437 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.653450 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.653467 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.653479 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:27Z","lastTransitionTime":"2025-09-30T06:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.658345 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/0.log" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.658391 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjjw8" event={"ID":"5bfd073c-4582-4a65-8170-7030f4852174","Type":"ContainerStarted","Data":"139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.689446 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release
-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.709508 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.726499 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.742075 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.755525 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.755555 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.755564 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.755578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.755587 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:27Z","lastTransitionTime":"2025-09-30T06:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.763521 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.779530 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.797514 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.817079 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.831561 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c655fbd5-0708-4151-b0fb-a97d8e1826bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.848904 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 
2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.858055 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.858179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.858217 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.858251 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.858277 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:27Z","lastTransitionTime":"2025-09-30T06:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.868700 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db77
08c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\"
:\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\
\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.894179 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f
837e82cdabf3ea0b92504260\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:06Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206257 6355 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 06:20:06.206290 6355 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:20:06.206550 6355 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206243 6355 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:20:06.206758 6355 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207239 6355 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207377 6355 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207466 6355 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 06:20:06.208066 6355 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.907120 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.922412 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.942164 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"2025-09-30T06:19:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28\\\\n2025-09-30T06:19:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28 to /host/opt/cni/bin/\\\\n2025-09-30T06:19:41Z [verbose] multus-daemon started\\\\n2025-09-30T06:19:41Z [verbose] Readiness Indicator file check\\\\n2025-09-30T06:20:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.955933 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.960546 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.960577 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.960585 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.960600 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.960610 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:27Z","lastTransitionTime":"2025-09-30T06:20:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.971003 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:27 crc kubenswrapper[4691]: I0930 06:20:27.984691 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-09-30T06:20:27Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.063622 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.063662 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.063672 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.063688 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.063702 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:28Z","lastTransitionTime":"2025-09-30T06:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.166730 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.166776 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.166786 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.166802 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.166814 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:28Z","lastTransitionTime":"2025-09-30T06:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.223767 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.223857 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:28 crc kubenswrapper[4691]: E0930 06:20:28.223929 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:28 crc kubenswrapper[4691]: E0930 06:20:28.223967 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.268756 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.268797 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.268810 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.268825 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.268837 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:28Z","lastTransitionTime":"2025-09-30T06:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.371081 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.371122 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.371135 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.371151 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.371164 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:28Z","lastTransitionTime":"2025-09-30T06:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.473062 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.473108 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.473120 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.473138 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.473151 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:28Z","lastTransitionTime":"2025-09-30T06:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.575310 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.575374 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.575396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.575423 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.575445 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:28Z","lastTransitionTime":"2025-09-30T06:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.678417 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.678479 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.678495 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.678518 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.678535 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:28Z","lastTransitionTime":"2025-09-30T06:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.781317 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.781380 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.781402 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.781432 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.781453 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:28Z","lastTransitionTime":"2025-09-30T06:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.884365 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.884419 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.884430 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.884448 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.884464 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:28Z","lastTransitionTime":"2025-09-30T06:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.987580 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.987642 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.987653 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.987673 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:28 crc kubenswrapper[4691]: I0930 06:20:28.987685 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:28Z","lastTransitionTime":"2025-09-30T06:20:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.091455 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.091497 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.091515 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.091539 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.091556 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:29Z","lastTransitionTime":"2025-09-30T06:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.193827 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.193871 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.193920 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.193952 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.193974 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:29Z","lastTransitionTime":"2025-09-30T06:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.224747 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:29 crc kubenswrapper[4691]: E0930 06:20:29.224987 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.225050 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:29 crc kubenswrapper[4691]: E0930 06:20:29.225200 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.296476 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.296528 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.296542 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.296560 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.296571 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:29Z","lastTransitionTime":"2025-09-30T06:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.400194 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.400270 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.400367 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.400397 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.400414 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:29Z","lastTransitionTime":"2025-09-30T06:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.503483 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.503549 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.503567 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.503590 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.503607 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:29Z","lastTransitionTime":"2025-09-30T06:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.606805 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.606846 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.606855 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.606871 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.606900 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:29Z","lastTransitionTime":"2025-09-30T06:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.710086 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.710155 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.710173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.710198 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.710218 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:29Z","lastTransitionTime":"2025-09-30T06:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.814229 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.814292 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.814310 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.814333 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.814350 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:29Z","lastTransitionTime":"2025-09-30T06:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.917570 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.917626 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.917645 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.917673 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:29 crc kubenswrapper[4691]: I0930 06:20:29.917695 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:29Z","lastTransitionTime":"2025-09-30T06:20:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.020501 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.020568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.020591 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.020621 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.020643 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:30Z","lastTransitionTime":"2025-09-30T06:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.122802 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.122857 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.122875 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.122926 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.122947 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:30Z","lastTransitionTime":"2025-09-30T06:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.223868 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.223982 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:30 crc kubenswrapper[4691]: E0930 06:20:30.224023 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:30 crc kubenswrapper[4691]: E0930 06:20:30.224156 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.229497 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.229531 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.229540 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.229555 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.229567 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:30Z","lastTransitionTime":"2025-09-30T06:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.332632 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.332698 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.332714 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.332739 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.332757 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:30Z","lastTransitionTime":"2025-09-30T06:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.436075 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.436129 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.436146 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.436170 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.436186 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:30Z","lastTransitionTime":"2025-09-30T06:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.539490 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.539556 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.539579 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.539609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.539632 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:30Z","lastTransitionTime":"2025-09-30T06:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.642646 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.642710 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.642731 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.642759 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.642779 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:30Z","lastTransitionTime":"2025-09-30T06:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.744968 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.745034 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.745052 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.745076 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.745092 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:30Z","lastTransitionTime":"2025-09-30T06:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.847568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.847636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.847657 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.847682 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.847726 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:30Z","lastTransitionTime":"2025-09-30T06:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.950806 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.950860 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.950877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.950922 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:30 crc kubenswrapper[4691]: I0930 06:20:30.950941 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:30Z","lastTransitionTime":"2025-09-30T06:20:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.054163 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.054216 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.054237 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.054265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.054286 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:31Z","lastTransitionTime":"2025-09-30T06:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.157339 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.157404 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.157423 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.157450 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.157467 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:31Z","lastTransitionTime":"2025-09-30T06:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.223997 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.224072 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:31 crc kubenswrapper[4691]: E0930 06:20:31.224145 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:31 crc kubenswrapper[4691]: E0930 06:20:31.224216 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.260589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.260625 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.260633 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.260645 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.260656 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:31Z","lastTransitionTime":"2025-09-30T06:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.364189 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.364246 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.364264 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.364286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.364302 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:31Z","lastTransitionTime":"2025-09-30T06:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.468295 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.468359 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.468376 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.468402 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.468419 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:31Z","lastTransitionTime":"2025-09-30T06:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.571312 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.571372 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.571389 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.571416 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.571434 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:31Z","lastTransitionTime":"2025-09-30T06:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.673425 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.673498 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.673521 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.673549 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.673572 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:31Z","lastTransitionTime":"2025-09-30T06:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.777308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.777384 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.777410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.777441 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.777462 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:31Z","lastTransitionTime":"2025-09-30T06:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.879755 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.879809 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.879829 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.879856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.879877 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:31Z","lastTransitionTime":"2025-09-30T06:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.982928 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.983000 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.983017 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.983040 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:31 crc kubenswrapper[4691]: I0930 06:20:31.983056 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:31Z","lastTransitionTime":"2025-09-30T06:20:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.085537 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.085604 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.085622 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.085648 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.085667 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:32Z","lastTransitionTime":"2025-09-30T06:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.189045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.189102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.189118 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.189145 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.189162 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:32Z","lastTransitionTime":"2025-09-30T06:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.223787 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.223802 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:32 crc kubenswrapper[4691]: E0930 06:20:32.224007 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:32 crc kubenswrapper[4691]: E0930 06:20:32.224122 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.292493 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.292535 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.292553 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.292574 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.292609 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:32Z","lastTransitionTime":"2025-09-30T06:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.394983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.395046 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.395064 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.395088 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.395109 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:32Z","lastTransitionTime":"2025-09-30T06:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.499230 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.499291 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.499308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.499332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.499353 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:32Z","lastTransitionTime":"2025-09-30T06:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.602991 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.603064 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.603089 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.603117 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.603139 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:32Z","lastTransitionTime":"2025-09-30T06:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.706022 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.706083 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.706103 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.706132 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.706151 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:32Z","lastTransitionTime":"2025-09-30T06:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.815968 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.816035 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.816053 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.816077 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.816094 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:32Z","lastTransitionTime":"2025-09-30T06:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.918513 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.918591 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.918621 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.918649 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:32 crc kubenswrapper[4691]: I0930 06:20:32.918674 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:32Z","lastTransitionTime":"2025-09-30T06:20:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.022361 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.022426 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.022443 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.022469 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.022486 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:33Z","lastTransitionTime":"2025-09-30T06:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.124949 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.125006 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.125021 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.125041 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.125057 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:33Z","lastTransitionTime":"2025-09-30T06:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.224065 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.224212 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:33 crc kubenswrapper[4691]: E0930 06:20:33.224292 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:33 crc kubenswrapper[4691]: E0930 06:20:33.224357 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.227661 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.227728 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.227746 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.227767 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.227784 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:33Z","lastTransitionTime":"2025-09-30T06:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.330467 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.330545 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.330568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.330605 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.330627 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:33Z","lastTransitionTime":"2025-09-30T06:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.433621 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.433684 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.433701 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.433729 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.433746 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:33Z","lastTransitionTime":"2025-09-30T06:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.536957 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.537014 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.537033 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.537056 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.537073 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:33Z","lastTransitionTime":"2025-09-30T06:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.640485 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.640547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.640565 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.640589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.640610 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:33Z","lastTransitionTime":"2025-09-30T06:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.742722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.743017 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.743174 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.743277 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.743382 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:33Z","lastTransitionTime":"2025-09-30T06:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.846146 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.846195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.846214 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.846237 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.846254 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:33Z","lastTransitionTime":"2025-09-30T06:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.949352 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.949431 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.949456 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.949484 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:33 crc kubenswrapper[4691]: I0930 06:20:33.949507 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:33Z","lastTransitionTime":"2025-09-30T06:20:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.052942 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.053018 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.053043 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.053075 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.053100 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.156205 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.156260 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.156276 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.156298 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.156316 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.224547 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.224590 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:34 crc kubenswrapper[4691]: E0930 06:20:34.224770 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:34 crc kubenswrapper[4691]: E0930 06:20:34.225011 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.226104 4691 scope.go:117] "RemoveContainer" containerID="e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.259590 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.259739 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.259765 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.259798 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.259821 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.363051 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.363353 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.363552 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.363768 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.363992 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.468049 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.468101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.468118 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.468142 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.468159 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.571924 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.571974 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.571993 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.572019 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.572039 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.676152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.676255 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.676277 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.676705 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.677112 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.686153 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/2.log" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.690078 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.691739 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.707812 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.739540 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.763148 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.776807 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.776864 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.776876 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.776909 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.776926 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.785698 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"2025-09-30T06:19:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28\\\\n2025-09-30T06:19:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28 to /host/opt/cni/bin/\\\\n2025-09-30T06:19:41Z [verbose] multus-daemon started\\\\n2025-09-30T06:19:41Z [verbose] Readiness Indicator file check\\\\n2025-09-30T06:20:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: E0930 06:20:34.797177 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.801728 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.801771 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.801788 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.801812 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.801829 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.807788 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: E0930 06:20:34.817118 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\
\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\
":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.821052 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.821110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.821129 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.821153 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.821170 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.823257 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.835000 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: E0930 06:20:34.837270 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.842223 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.842256 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.842271 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.842288 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.842299 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.846627 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: E0930 06:20:34.858546 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.862043 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.862089 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.862102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.862120 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.862136 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: E0930 06:20:34.875985 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: E0930 06:20:34.876221 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.878521 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c
4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce5
9100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.879789 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.879833 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.879858 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.879920 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.879945 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.892931 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.910796 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.924208 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.936614 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.951018 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.962913 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c655fbd5-0708-4151-b0fb-a97d8e1826bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.981829 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.981909 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.981932 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.981957 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 
06:20:34.981975 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:34Z","lastTransitionTime":"2025-09-30T06:20:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:34 crc kubenswrapper[4691]: I0930 06:20:34.983480 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:34Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.003747 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.037379 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:06Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206257 6355 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 06:20:06.206290 6355 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:20:06.206550 6355 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206243 6355 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:20:06.206758 6355 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207239 6355 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207377 6355 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207466 6355 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 06:20:06.208066 6355 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.084616 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.084681 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.084701 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.084725 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.084743 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:35Z","lastTransitionTime":"2025-09-30T06:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.187717 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.187776 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.187792 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.187849 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.187868 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:35Z","lastTransitionTime":"2025-09-30T06:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.224609 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:35 crc kubenswrapper[4691]: E0930 06:20:35.224821 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.224849 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:35 crc kubenswrapper[4691]: E0930 06:20:35.225095 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.291794 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.291856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.291872 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.291931 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.291951 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:35Z","lastTransitionTime":"2025-09-30T06:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.394494 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.394561 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.394581 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.394606 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.394625 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:35Z","lastTransitionTime":"2025-09-30T06:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.497623 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.497705 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.497725 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.497752 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.497770 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:35Z","lastTransitionTime":"2025-09-30T06:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.600833 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.600932 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.600955 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.600978 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.600996 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:35Z","lastTransitionTime":"2025-09-30T06:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.697344 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/3.log" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.698512 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/2.log" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.703288 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.703342 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.703364 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.703394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.703416 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:35Z","lastTransitionTime":"2025-09-30T06:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.703506 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e" exitCode=1 Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.703554 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"} Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.703614 4691 scope.go:117] "RemoveContainer" containerID="e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.705188 4691 scope.go:117] "RemoveContainer" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e" Sep 30 06:20:35 crc kubenswrapper[4691]: E0930 06:20:35.705546 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.721024 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.739027 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 
06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.756481 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.777842 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.795937 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.806052 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.806186 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.806206 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.806231 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.806248 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:35Z","lastTransitionTime":"2025-09-30T06:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.820184 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"2025-09-30T06:19:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28\\\\n2025-09-30T06:19:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28 to /host/opt/cni/bin/\\\\n2025-09-30T06:19:41Z [verbose] multus-daemon started\\\\n2025-09-30T06:19:41Z [verbose] Readiness Indicator file check\\\\n2025-09-30T06:20:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.841148 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.860535 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.877809 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.895934 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.910698 4691 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.910745 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.910763 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.910785 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.910802 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:35Z","lastTransitionTime":"2025-09-30T06:20:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.931196 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\
"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.954289 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.972872 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:35 crc kubenswrapper[4691]: I0930 06:20:35.994518 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:35Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.013833 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.013925 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.013943 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.013967 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.013988 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:36Z","lastTransitionTime":"2025-09-30T06:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.013771 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c655fbd5-0708-4151-b0fb-a97d8e1826bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-re
sources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.035873 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.059292 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.088926 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4eafbd6f3545c96381319c2b5692b5eccef9c2f837e82cdabf3ea0b92504260\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:06Z\\\",\\\"message\\\":\\\"/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206257 6355 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0930 06:20:06.206290 6355 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0930 06:20:06.206550 6355 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.206243 6355 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0930 06:20:06.206758 6355 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207239 6355 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207377 6355 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0930 06:20:06.207466 6355 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0930 06:20:06.208066 6355 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:05Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:35Z\\\",\\\"message\\\":\\\"selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == 
{e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:20:35.149244 6712 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 06:20:35.149429 6712 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:20:35.149455 6712 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0930 06:20:35.149188 6712 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-xjjw8 after 0 failed attempt(s)\\\\nI0930 06:20:35.149471 6712 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf
0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.117526 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.117600 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.117624 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.117651 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:36 crc kubenswrapper[4691]: 
I0930 06:20:36.117672 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:36Z","lastTransitionTime":"2025-09-30T06:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.221123 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.221183 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.221208 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.221238 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.221259 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:36Z","lastTransitionTime":"2025-09-30T06:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.224405 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.224448 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:36 crc kubenswrapper[4691]: E0930 06:20:36.224587 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:36 crc kubenswrapper[4691]: E0930 06:20:36.224719 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.324701 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.324764 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.324781 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.324806 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.324823 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:36Z","lastTransitionTime":"2025-09-30T06:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.427710 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.427766 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.427783 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.427805 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.427823 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:36Z","lastTransitionTime":"2025-09-30T06:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.530942 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.531017 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.531045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.531076 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.531097 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:36Z","lastTransitionTime":"2025-09-30T06:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.633851 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.633934 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.633952 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.633976 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.633996 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:36Z","lastTransitionTime":"2025-09-30T06:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.711254 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/3.log" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.717205 4691 scope.go:117] "RemoveContainer" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e" Sep 30 06:20:36 crc kubenswrapper[4691]: E0930 06:20:36.717488 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.738419 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.738593 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.738680 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.738703 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.738733 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.738754 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:36Z","lastTransitionTime":"2025-09-30T06:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.758397 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c655fbd5-0708-4151-b0fb-a97d8e1826bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.779679 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.807253 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"c
ontainerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.839566 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f98
15f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:35Z\\\",\\\"message\\\":\\\"selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:20:35.149244 6712 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 06:20:35.149429 6712 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:20:35.149455 6712 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0930 06:20:35.149188 6712 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-xjjw8 after 0 failed attempt(s)\\\\nI0930 06:20:35.149471 6712 model_client.go:398] Mutate operations generated as: [{Op:mutate 
Table:L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recur
siveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.841314 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.841360 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.841377 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.841399 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.841418 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:36Z","lastTransitionTime":"2025-09-30T06:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.856161 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.875070 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.892533 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.912688 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"2025-09-30T06:19:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28\\\\n2025-09-30T06:19:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28 to /host/opt/cni/bin/\\\\n2025-09-30T06:19:41Z [verbose] multus-daemon started\\\\n2025-09-30T06:19:41Z [verbose] Readiness Indicator file check\\\\n2025-09-30T06:20:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.931000 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.944449 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.944513 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.944536 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.944568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.944592 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:36Z","lastTransitionTime":"2025-09-30T06:20:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.950676 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" 
for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.966477 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:36 crc kubenswrapper[4691]: I0930 06:20:36.983678 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:36Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.015917 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.039251 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.050768 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.050828 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.050846 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.050871 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.050913 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:37Z","lastTransitionTime":"2025-09-30T06:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.080724 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.119662 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.140994 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.153678 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.153991 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.154139 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.154305 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.154466 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:37Z","lastTransitionTime":"2025-09-30T06:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.225033 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.225122 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:37 crc kubenswrapper[4691]: E0930 06:20:37.225234 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:37 crc kubenswrapper[4691]: E0930 06:20:37.225344 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.242075 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.257922 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.258170 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.258396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.258573 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.258737 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:37Z","lastTransitionTime":"2025-09-30T06:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.261575 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.277799 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.292993 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.324874 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.342620 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.356515 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.363724 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.363772 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.363786 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.363804 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.363816 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:37Z","lastTransitionTime":"2025-09-30T06:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.378622 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.396064 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c655fbd5-0708-4151-b0fb-a97d8e1826bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.414912 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 
2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.443841 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.463742 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:35Z\\\",\\\"message\\\":\\\"selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:20:35.149244 6712 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 06:20:35.149429 6712 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:20:35.149455 6712 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0930 06:20:35.149188 6712 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-xjjw8 after 0 failed attempt(s)\\\\nI0930 06:20:35.149471 6712 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.465678 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.465776 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.465788 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.465801 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.465813 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:37Z","lastTransitionTime":"2025-09-30T06:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.478566 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.492974 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 
06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.507345 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.520318 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.535745 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.556155 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"2025-09-30T06:19:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28\\\\n2025-09-30T06:19:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28 to /host/opt/cni/bin/\\\\n2025-09-30T06:19:41Z [verbose] multus-daemon started\\\\n2025-09-30T06:19:41Z [verbose] Readiness Indicator file check\\\\n2025-09-30T06:20:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:37Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.568507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.568556 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.568568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.568587 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.568601 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:37Z","lastTransitionTime":"2025-09-30T06:20:37Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.671489 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.671522 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.671536 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.671552 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.671563 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:37Z","lastTransitionTime":"2025-09-30T06:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.774983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.775086 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.775105 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.775129 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.775146 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:37Z","lastTransitionTime":"2025-09-30T06:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.878789 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.878866 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.878913 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.878952 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.878969 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:37Z","lastTransitionTime":"2025-09-30T06:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.981786 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.981835 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.981843 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.981856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:37 crc kubenswrapper[4691]: I0930 06:20:37.981865 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:37Z","lastTransitionTime":"2025-09-30T06:20:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:38 crc kubenswrapper[4691]: I0930 06:20:38.085155 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:38 crc kubenswrapper[4691]: I0930 06:20:38.085227 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:38 crc kubenswrapper[4691]: I0930 06:20:38.085245 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:38 crc kubenswrapper[4691]: I0930 06:20:38.085272 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:38 crc kubenswrapper[4691]: I0930 06:20:38.085291 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:38Z","lastTransitionTime":"2025-09-30T06:20:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 06:20:38 crc kubenswrapper[4691]: I0930 06:20:38.224476 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:20:38 crc kubenswrapper[4691]: I0930 06:20:38.224537 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq"
Sep 30 06:20:38 crc kubenswrapper[4691]: E0930 06:20:38.224752 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 06:20:38 crc kubenswrapper[4691]: E0930 06:20:38.224943 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1"
Sep 30 06:20:39 crc kubenswrapper[4691]: I0930 06:20:39.225267 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 06:20:39 crc kubenswrapper[4691]: E0930 06:20:39.225402 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 06:20:39 crc kubenswrapper[4691]: I0930 06:20:39.225570 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 06:20:39 crc kubenswrapper[4691]: E0930 06:20:39.225794 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 06:20:40 crc kubenswrapper[4691]: I0930 06:20:40.224260 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:20:40 crc kubenswrapper[4691]: I0930 06:20:40.224300 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq"
Sep 30 06:20:40 crc kubenswrapper[4691]: E0930 06:20:40.224477 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 06:20:40 crc kubenswrapper[4691]: E0930 06:20:40.224826 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1"
Sep 30 06:20:40 crc kubenswrapper[4691]: I0930 06:20:40.242228 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Sep 30 06:20:41 crc kubenswrapper[4691]: I0930 06:20:41.066008 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.066305 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.066264021 +0000 UTC m=+148.541285091 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:20:41 crc kubenswrapper[4691]: I0930 06:20:41.066869 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:20:41 crc kubenswrapper[4691]: I0930 06:20:41.067162 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.067036 4691 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.067321 4691 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.067722 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.067429999 +0000 UTC m=+148.542451069 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.067794 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.067735159 +0000 UTC m=+148.542756229 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
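The TearDown failure above is the kubelet reporting that no CSI plugin named kubevirt.io.hostpath-provisioner has registered with it yet; node-side registrations are mirrored in the CSINode object. A sketch that lists the drivers recorded there, assuming kubeconfig access, with the node and driver names taken from the log:

    package main

    import (
        "context"
        "fmt"
        "log"

        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            log.Fatal(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            log.Fatal(err)
        }
        // CSINode "crc" mirrors the CSI plugins registered with this node's kubelet.
        csiNode, err := cs.StorageV1().CSINodes().Get(context.Background(), "crc", metav1.GetOptions{})
        if err != nil {
            log.Fatal(err)
        }
        const want = "kubevirt.io.hostpath-provisioner" // driver named in the error above
        found := false
        for _, d := range csiNode.Spec.Drivers {
            fmt.Println("registered driver:", d.Name)
            if d.Name == want {
                found = true
            }
        }
        if !found {
            fmt.Println(want, "is not registered yet - matches the TearDown error above")
        }
    }

Note the retry backoff in the log: the kubelet will not reattempt the unmount for 1m4s, so the condition can persist a while after the driver does register.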
Sep 30 06:20:41 crc kubenswrapper[4691]: I0930 06:20:41.168423 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 06:20:41 crc kubenswrapper[4691]: I0930 06:20:41.168569 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.168796 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.168854 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.168880 4691 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.168805 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.168999 4691 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.169021 4691 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.169034 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.168999014 +0000 UTC m=+148.644020084 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.169107 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.169081816 +0000 UTC m=+148.644102886 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Sep 30 06:20:41 crc kubenswrapper[4691]: I0930 06:20:41.223943 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 06:20:41 crc kubenswrapper[4691]: I0930 06:20:41.223948 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.224141 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 06:20:41 crc kubenswrapper[4691]: E0930 06:20:41.224358 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 06:20:42 crc kubenswrapper[4691]: I0930 06:20:42.224757 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:20:42 crc kubenswrapper[4691]: I0930 06:20:42.224756 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq"
Sep 30 06:20:42 crc kubenswrapper[4691]: E0930 06:20:42.225015 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 06:20:42 crc kubenswrapper[4691]: E0930 06:20:42.225339 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1"
Sep 30 06:20:43 crc kubenswrapper[4691]: I0930 06:20:43.224301 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 06:20:43 crc kubenswrapper[4691]: I0930 06:20:43.224348 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 06:20:43 crc kubenswrapper[4691]: E0930 06:20:43.224493 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 06:20:43 crc kubenswrapper[4691]: E0930 06:20:43.224619 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 06:20:44 crc kubenswrapper[4691]: I0930 06:20:44.224671 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:20:44 crc kubenswrapper[4691]: I0930 06:20:44.224704 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq"
Sep 30 06:20:44 crc kubenswrapper[4691]: E0930 06:20:44.224876 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 06:20:44 crc kubenswrapper[4691]: E0930 06:20:44.225034 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1"
Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.224153 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.224153 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 06:20:45 crc kubenswrapper[4691]: E0930 06:20:45.224359 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 06:20:45 crc kubenswrapper[4691]: E0930 06:20:45.224617 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 06:20:45 crc kubenswrapper[4691]: E0930 06:20:45.281847 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154a
fa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.288025 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.288067 4691 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.288076 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.288090 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.288101 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:45 crc kubenswrapper[4691]: E0930 06:20:45.304695 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.309244 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.309268 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
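
Every status-update retry in this stretch fails identically: the patch is rejected because the node.network-node-identity.openshift.io webhook at 127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24T17:21:41Z, well before the node clock of 2025-09-30T06:20:45Z. A minimal Python sketch to confirm the validity window from the node itself; the address is taken from the Post URL in the error, and the third-party cryptography package is an assumption:

    import ssl
    from cryptography import x509  # assumption: cryptography package installed

    # Fetch the webhook's serving certificate WITHOUT verifying it --
    # verification is exactly what fails in the kubelet errors above.
    pem = ssl.get_server_certificate(("127.0.0.1", 9743))
    cert = x509.load_pem_x509_certificate(pem.encode())

    # Per the log, notAfter should print 2025-08-24 17:21:41.
    print("notBefore:", cert.not_valid_before)
    print("notAfter: ", cert.not_valid_after)
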
event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.309278 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.309293 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.309303 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:45 crc kubenswrapper[4691]: E0930 06:20:45.322532 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.326770 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.326838 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
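
Interleaved with the webhook failures, the Ready condition itself stays False for a second reason: the kubelet finds no CNI network config in /etc/kubernetes/cni/net.d/. A short sketch of the readiness check, assuming (as upstream ocicni does) that files ending in .conf, .conflist, or .json count as network configurations; NetworkReady only flips to true once the network provider writes one:

    import os

    CNI_CONF_DIR = "/etc/kubernetes/cni/net.d"  # directory named in the kubelet message

    def cni_configs(path=CNI_CONF_DIR):
        # Assumption: like upstream ocicni, treat *.conf, *.conflist and
        # *.json files as candidate network configurations.
        try:
            return sorted(f for f in os.listdir(path)
                          if f.endswith((".conf", ".conflist", ".json")))
        except FileNotFoundError:
            return []

    configs = cni_configs()
    print(configs if configs else "no CNI configuration file")  # matches the log
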
event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.326862 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.326918 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.326946 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:45 crc kubenswrapper[4691]: E0930 06:20:45.354519 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.359811 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.359877 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
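
The attempt below is the last one in this cycle: the kubelet retries the status patch a fixed number of times per sync loop and then gives up with "update node status exceeds retry count" (kubelet_node_status.go:572). A sketch of that control flow; the retry constant of 5 is an assumption based on the upstream kubelet's nodeStatusUpdateRetry, not something stated in this log:

    # Sketch of the retry pattern visible in this log; the constant is an
    # assumption taken from the upstream kubelet (nodeStatusUpdateRetry = 5).
    NODE_STATUS_UPDATE_RETRY = 5

    def update_node_status(try_update):
        for _ in range(NODE_STATUS_UPDATE_RETRY):
            err = try_update()
            if err is None:
                return None  # patch accepted
            print("Error updating node status, will retry:", err)
        return "update node status exceeds retry count"

    # Every attempt fails the same way the webhook calls above do.
    print(update_node_status(lambda: "x509: certificate has expired"))
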
event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.359947 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.359981 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.360002 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:45 crc kubenswrapper[4691]: E0930 06:20:45.374236 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:45Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:45 crc kubenswrapper[4691]: E0930 06:20:45.374457 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.376594 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.376676 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.376704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.376734 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.376759 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.478742 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.478808 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.478831 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.478859 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.478916 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.581245 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.581328 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.581350 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.581382 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.581408 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.684657 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.684729 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.684749 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.684775 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.684793 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.787097 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.787157 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.787173 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.787195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.787212 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.890656 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.890722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.890739 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.890766 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.890783 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.993777 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.993847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.993867 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.993932 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:45 crc kubenswrapper[4691]: I0930 06:20:45.993950 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:45Z","lastTransitionTime":"2025-09-30T06:20:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.096626 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.096712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.096733 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.096758 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.096775 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:46Z","lastTransitionTime":"2025-09-30T06:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.200687 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.200768 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.200792 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.200824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.200848 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:46Z","lastTransitionTime":"2025-09-30T06:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.224615 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.224721 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:46 crc kubenswrapper[4691]: E0930 06:20:46.224938 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:46 crc kubenswrapper[4691]: E0930 06:20:46.225247 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.304572 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.304629 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.304645 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.304670 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.304688 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:46Z","lastTransitionTime":"2025-09-30T06:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.408609 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.408698 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.408715 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.408742 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.408759 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:46Z","lastTransitionTime":"2025-09-30T06:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.512538 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.512986 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.513152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.513297 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.513431 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:46Z","lastTransitionTime":"2025-09-30T06:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.617325 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.617732 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.617945 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.618155 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.618320 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:46Z","lastTransitionTime":"2025-09-30T06:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.722415 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.722471 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.722491 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.722519 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.722537 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:46Z","lastTransitionTime":"2025-09-30T06:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.824912 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.824977 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.824996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.825020 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.825065 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:46Z","lastTransitionTime":"2025-09-30T06:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.927571 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.927647 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.927670 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.927698 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:46 crc kubenswrapper[4691]: I0930 06:20:46.927718 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:46Z","lastTransitionTime":"2025-09-30T06:20:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.030367 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.030431 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.030453 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.030497 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.030519 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.133571 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.133639 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.133665 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.133694 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.133717 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.224733 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.224790 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:47 crc kubenswrapper[4691]: E0930 06:20:47.224972 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:47 crc kubenswrapper[4691]: E0930 06:20:47.225080 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.236775 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.236856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.236916 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.236960 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.237021 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.241774 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782724ba-31c7-4bd7-bd92-05a1e5a31c13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3410624e48656bd289ed6d53742ae7dd5e0ba5148c42d9964a540ae97bd0b8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fff6d766366a54e73752a41e180f1e850b8e9c41a8189b7f1df5f82d28e2566e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fff6d766366a54e73752a41e180f1e850b8e9c41a8189b7f1df5f82d28e2566e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.257425 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.273414 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c655fbd5-0708-4151-b0fb-a97d8e1826bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.298586 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.324937 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.339379 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.339444 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:47 crc 
kubenswrapper[4691]: I0930 06:20:47.339482 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.339505 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.339519 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.363801 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cd
ac579abe7c4f41fe6988f62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:35Z\\\",\\\"message\\\":\\\"selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:20:35.149244 6712 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 06:20:35.149429 6712 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:20:35.149455 6712 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0930 06:20:35.149188 6712 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-xjjw8 after 0 failed attempt(s)\\\\nI0930 06:20:35.149471 6712 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.390758 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.411528 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.432825 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"2025-09-30T06:19:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28\\\\n2025-09-30T06:19:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28 to /host/opt/cni/bin/\\\\n2025-09-30T06:19:41Z [verbose] multus-daemon started\\\\n2025-09-30T06:19:41Z [verbose] Readiness Indicator file check\\\\n2025-09-30T06:20:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.441655 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.441703 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.441719 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.441738 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.441750 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container 
runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.449132 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.467789 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 
06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.485734 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.504102 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.537766 4691 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.544215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.544273 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.544293 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.544319 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.544339 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.559231 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/ku
bernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake 
timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.578991 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.598776 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.617235 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.632618 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:47Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.647385 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.647589 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.647706 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.647828 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.647951 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.750544 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.750595 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.750635 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.750658 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.750674 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.855018 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.855061 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.855072 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.855091 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.855100 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.958005 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.958067 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.958085 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.958111 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:47 crc kubenswrapper[4691]: I0930 06:20:47.958128 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.061265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.061405 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.061419 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.061438 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.061450 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:48Z","lastTransitionTime":"2025-09-30T06:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.164451 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.164520 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.164539 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.164565 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.164582 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:48Z","lastTransitionTime":"2025-09-30T06:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
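
Each setters.go:603 record above embeds the node's Ready condition as inline JSON. When sifting a capture like this, it can help to decode that payload; by the usual Kubernetes convention, lastHeartbeatTime marks the latest sync that reported the condition, while lastTransitionTime marks when the status value last changed. A small sketch (Go 1.18+ for strings.Cut; the struct fields are inferred from the JSON visible in these records, not imported from the Kubernetes API packages):

```go
package main

import (
	"encoding/json"
	"fmt"
	"strings"
	"time"
)

// NodeCondition mirrors the condition={...} payload carried by the
// setters.go records in this log.
type NodeCondition struct {
	Type               string    `json:"type"`
	Status             string    `json:"status"`
	LastHeartbeatTime  time.Time `json:"lastHeartbeatTime"`
	LastTransitionTime time.Time `json:"lastTransitionTime"`
	Reason             string    `json:"reason"`
	Message            string    `json:"message"`
}

func main() {
	// One record from above, abbreviated to the relevant tail.
	line := `setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:47Z","lastTransitionTime":"2025-09-30T06:20:47Z","reason":"KubeletNotReady","message":"container runtime network not ready"}`

	_, payload, ok := strings.Cut(line, "condition=")
	if !ok {
		fmt.Println("no condition payload on this line")
		return
	}
	var c NodeCondition
	if err := json.Unmarshal([]byte(payload), &c); err != nil {
		fmt.Println("decode:", err)
		return
	}
	fmt.Printf("%s=%s reason=%s heartbeat=%s transition=%s\n",
		c.Type, c.Status, c.Reason,
		c.LastHeartbeatTime.Format(time.RFC3339),
		c.LastTransitionTime.Format(time.RFC3339))
}
```
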
Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.224125 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.224147 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:48 crc kubenswrapper[4691]: E0930 06:20:48.224364 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:48 crc kubenswrapper[4691]: E0930 06:20:48.224532 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.267557 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.267601 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.267613 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.267630 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.267645 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:48Z","lastTransitionTime":"2025-09-30T06:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
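
The repeated condition keeps pointing at the same root cause: the container runtime reports NetworkReady=false until a CNI network configuration appears in /etc/kubernetes/cni/net.d/. In rough terms that readiness probe amounts to a directory scan like the sketch below; this is an illustrative re-implementation of the idea, not the actual CRI-O/ocicni code, and the extension list is the usual CNI convention rather than something read from this cluster:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
	"strings"
)

// hasCNIConfig approximates the check implied by the repeated
// "no CNI configuration file in /etc/kubernetes/cni/net.d/" message:
// the network is considered ready once at least one CNI network
// config file shows up in the conf directory.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch strings.ToLower(filepath.Ext(e.Name())) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	// Directory taken from the log message above.
	ok, err := hasCNIConfig("/etc/kubernetes/cni/net.d")
	fmt.Println("cni config present:", ok, "err:", err)
}
```

In this boot the file is expected to be written by the OVN-Kubernetes plugin once its node pods come up, which is why the message persists while ovnkube-controller is crash-looping.
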
Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.371992 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.372066 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.372082 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.372111 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.372130 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:48Z","lastTransitionTime":"2025-09-30T06:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.476087 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.476154 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.476171 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.476197 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.476215 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:48Z","lastTransitionTime":"2025-09-30T06:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.579847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.579974 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.580000 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.580027 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.580046 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:48Z","lastTransitionTime":"2025-09-30T06:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.683007 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.683072 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.683096 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.683124 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.683144 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:48Z","lastTransitionTime":"2025-09-30T06:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.786361 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.786422 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.786438 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.786462 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.786479 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:48Z","lastTransitionTime":"2025-09-30T06:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.889336 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.889418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.889451 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.889479 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.889502 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:48Z","lastTransitionTime":"2025-09-30T06:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.993649 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.993720 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.993742 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.993770 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:48 crc kubenswrapper[4691]: I0930 06:20:48.993791 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:48Z","lastTransitionTime":"2025-09-30T06:20:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.096418 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.096489 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.096506 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.096528 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.096547 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:49Z","lastTransitionTime":"2025-09-30T06:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.205774 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.205843 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.205863 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.205918 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.205942 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:49Z","lastTransitionTime":"2025-09-30T06:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.224864 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.225413 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:49 crc kubenswrapper[4691]: E0930 06:20:49.225584 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:49 crc kubenswrapper[4691]: E0930 06:20:49.225745 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.225966 4691 scope.go:117] "RemoveContainer" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e" Sep 30 06:20:49 crc kubenswrapper[4691]: E0930 06:20:49.226229 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.308854 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.308954 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.308981 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.309008 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.309028 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:49Z","lastTransitionTime":"2025-09-30T06:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
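
Mixed into the status churn, the pod_workers record at 06:20:49.226229 shows why the network never comes up: the ovnkube-controller container of ovnkube-node-sjmvw is held in CrashLoopBackOff with a 40s back-off. The kubelet doubles the restart delay after each consecutive failure up to a cap; assuming the commonly cited defaults of a 10s base and a 5m ceiling (assumptions, not values read from this cluster), 40s would correspond to the third consecutive crash. A sketch of that schedule:

```go
package main

import (
	"fmt"
	"time"
)

// crashLoopDelay sketches the kubelet's restart back-off: the delay
// doubles after each consecutive failure and is clamped at maxDelay.
// The 10s base and 5m ceiling used below are assumed defaults, not
// values read from this cluster's configuration.
func crashLoopDelay(failures int, base, maxDelay time.Duration) time.Duration {
	d := base
	for i := 1; i < failures; i++ {
		d *= 2
		if d > maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for n := 1; n <= 6; n++ {
		// Under these assumptions n=3 prints 40s, matching the
		// "back-off 40s" in the record above.
		fmt.Printf("failure %d -> back-off %s\n",
			n, crashLoopDelay(n, 10*time.Second, 5*time.Minute))
	}
}
```
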
Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.411680 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.411748 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.411773 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.411803 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.411823 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:49Z","lastTransitionTime":"2025-09-30T06:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.515126 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.515201 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.515224 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.515254 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.515279 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:49Z","lastTransitionTime":"2025-09-30T06:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.617563 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.617614 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.617628 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.617646 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.617658 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:49Z","lastTransitionTime":"2025-09-30T06:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.721594 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.721869 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.722152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.722406 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.722607 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:49Z","lastTransitionTime":"2025-09-30T06:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.826065 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.826123 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.826141 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.826164 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.826183 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:49Z","lastTransitionTime":"2025-09-30T06:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.929712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.929799 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.929822 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.929853 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:49 crc kubenswrapper[4691]: I0930 06:20:49.929876 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:49Z","lastTransitionTime":"2025-09-30T06:20:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.033056 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.033388 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.033702 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.033970 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.034207 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:50Z","lastTransitionTime":"2025-09-30T06:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.137403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.137683 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.137843 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.138056 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.138203 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:50Z","lastTransitionTime":"2025-09-30T06:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.223779 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.223854 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:50 crc kubenswrapper[4691]: E0930 06:20:50.224061 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:50 crc kubenswrapper[4691]: E0930 06:20:50.224202 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.241541 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.241601 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.241620 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.241640 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.241656 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:50Z","lastTransitionTime":"2025-09-30T06:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.344201 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.344248 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.344265 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.344284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.344299 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:50Z","lastTransitionTime":"2025-09-30T06:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.446836 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.446878 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.446907 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.446926 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.446938 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:50Z","lastTransitionTime":"2025-09-30T06:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.549979 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.550054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.550077 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.550105 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.550123 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:50Z","lastTransitionTime":"2025-09-30T06:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.653731 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.653786 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.653802 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.653826 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.653844 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:50Z","lastTransitionTime":"2025-09-30T06:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.757384 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.757456 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.757480 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.757510 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.757533 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:50Z","lastTransitionTime":"2025-09-30T06:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.860966 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.861021 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.861044 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.861074 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.861094 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:50Z","lastTransitionTime":"2025-09-30T06:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.964008 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.964062 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.964080 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.964104 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:50 crc kubenswrapper[4691]: I0930 06:20:50.964120 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:50Z","lastTransitionTime":"2025-09-30T06:20:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.067251 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.067307 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.067324 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.067345 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.067363 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:51Z","lastTransitionTime":"2025-09-30T06:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.171288 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.171368 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.171392 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.171427 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.171450 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:51Z","lastTransitionTime":"2025-09-30T06:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.223887 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.223953 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:51 crc kubenswrapper[4691]: E0930 06:20:51.224113 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:51 crc kubenswrapper[4691]: E0930 06:20:51.224406 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.275140 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.275196 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.275223 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.275254 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.275276 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:51Z","lastTransitionTime":"2025-09-30T06:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.378510 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.378549 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.378565 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.378592 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.378608 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:51Z","lastTransitionTime":"2025-09-30T06:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.481822 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.481865 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.481876 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.481915 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.481929 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:51Z","lastTransitionTime":"2025-09-30T06:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.585102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.585578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.585764 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.585985 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.586184 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:51Z","lastTransitionTime":"2025-09-30T06:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.689252 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.689522 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.689728 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.689833 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.689940 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:51Z","lastTransitionTime":"2025-09-30T06:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.792338 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.792657 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.792851 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.793102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.793240 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:51Z","lastTransitionTime":"2025-09-30T06:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.896778 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.896844 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.896867 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.896897 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:51 crc kubenswrapper[4691]: I0930 06:20:51.896940 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:51Z","lastTransitionTime":"2025-09-30T06:20:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.003829 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.003889 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.003938 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.003966 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.003984 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.106752 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.106801 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.106818 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.106839 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.106856 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.210356 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.210423 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.210456 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.210483 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.210504 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.224783 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.224819 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:52 crc kubenswrapper[4691]: E0930 06:20:52.225034 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:52 crc kubenswrapper[4691]: E0930 06:20:52.225167 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.312924 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.312983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.313002 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.313025 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.313041 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.415037 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.415084 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.415102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.415123 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.415138 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.312924 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.312983 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.313002 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.313025 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.313041 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.415037 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.415084 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.415102 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.415123 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.415138 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.517502 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.517539 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.517550 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.517564 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.517572 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.621230 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.621278 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.621291 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.621307 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.621318 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.724934 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.724996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.725014 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.725042 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.725082 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.827978 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.828026 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.828041 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.828059 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.828074 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.932569 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.932614 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.932628 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.932646 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:52 crc kubenswrapper[4691]: I0930 06:20:52.932658 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:52Z","lastTransitionTime":"2025-09-30T06:20:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.036187 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.036249 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.036266 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.036290 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.036306 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:53Z","lastTransitionTime":"2025-09-30T06:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.138759 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.138821 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.138845 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.138876 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.138945 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:53Z","lastTransitionTime":"2025-09-30T06:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.224613 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 06:20:53 crc kubenswrapper[4691]: E0930 06:20:53.224785 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.225145 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 06:20:53 crc kubenswrapper[4691]: E0930 06:20:53.225437 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.241642 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.241693 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.241715 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.241744 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.241765 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:53Z","lastTransitionTime":"2025-09-30T06:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.344827 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.344950 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.344977 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.345008 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.345028 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:53Z","lastTransitionTime":"2025-09-30T06:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.447972 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.448042 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.448060 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.448083 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.448104 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:53Z","lastTransitionTime":"2025-09-30T06:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.551371 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.551434 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.551453 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.551478 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.551496 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:53Z","lastTransitionTime":"2025-09-30T06:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.655025 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.655115 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.655140 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.655171 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.655192 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:53Z","lastTransitionTime":"2025-09-30T06:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.758524 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.758588 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.758605 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.758630 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.758649 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:53Z","lastTransitionTime":"2025-09-30T06:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.861015 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.861076 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.861126 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.861152 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.861172 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:53Z","lastTransitionTime":"2025-09-30T06:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.964338 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.964409 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.964427 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.964450 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:53 crc kubenswrapper[4691]: I0930 06:20:53.964469 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:53Z","lastTransitionTime":"2025-09-30T06:20:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.067929 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.068011 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.068028 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.068051 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.068068 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:54Z","lastTransitionTime":"2025-09-30T06:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.171433 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.171515 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.171539 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.171574 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.171598 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:54Z","lastTransitionTime":"2025-09-30T06:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.224493 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.224687 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq"
Sep 30 06:20:54 crc kubenswrapper[4691]: E0930 06:20:54.225000 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 06:20:54 crc kubenswrapper[4691]: E0930 06:20:54.225074 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.274527 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.274611 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.274631 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.274659 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.274678 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:54Z","lastTransitionTime":"2025-09-30T06:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.379231 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.379294 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.379310 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.379332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.379347 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:54Z","lastTransitionTime":"2025-09-30T06:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.482752 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.482802 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.482818 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.482835 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.482851 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:54Z","lastTransitionTime":"2025-09-30T06:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.586085 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.586159 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.586175 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.586207 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.586233 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:54Z","lastTransitionTime":"2025-09-30T06:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.689426 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.689495 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.689516 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.689547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.689569 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:54Z","lastTransitionTime":"2025-09-30T06:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.792238 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.792388 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.792410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.792436 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.792454 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:54Z","lastTransitionTime":"2025-09-30T06:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.896518 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.896610 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.896637 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.896668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.896704 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:54Z","lastTransitionTime":"2025-09-30T06:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.999426 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.999488 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.999509 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.999533 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:54 crc kubenswrapper[4691]: I0930 06:20:54.999549 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:54Z","lastTransitionTime":"2025-09-30T06:20:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.103006 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.103082 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.103104 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.103133 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.103155 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.205428 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.205482 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.205499 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.205525 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.205541 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.224507 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.224660 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Sep 30 06:20:55 crc kubenswrapper[4691]: E0930 06:20:55.224845 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Sep 30 06:20:55 crc kubenswrapper[4691]: E0930 06:20:55.225118 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.309395 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.309447 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.309463 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.309487 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.309504 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.412502 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.412555 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.412572 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.412596 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.412613 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.479060 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.479147 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.479169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.479195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.479217 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:20:55 crc kubenswrapper[4691]: E0930 06:20:55.502563 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:55Z is after 2025-08-24T17:21:41Z"
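
The status patch itself is well-formed (a strategic-merge patch carrying the node conditions, allocatable/capacity, the image list, and nodeInfo); it is rejected because the node.network-node-identity.openshift.io admission webhook at https://127.0.0.1:9743 presents a serving certificate that expired on 2025-08-24, more than a month before the log's current time. A quick way one might verify the certificate window from the node, assuming openssl is available there (an illustrative assumption, not from the log):

    # Dump the webhook server's certificate validity window; compare notAfter with the current time
    echo | openssl s_client -connect 127.0.0.1:9743 2>/dev/null | openssl x509 -noout -dates

On CRC this pattern is commonly seen when the VM has been stopped past an internal certificate's lifetime; node status updates keep failing until the cluster rotates the expired certificates.
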
event="NodeHasNoDiskPressure" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.509259 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.509284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.509304 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:55 crc kubenswrapper[4691]: E0930 06:20:55.530133 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:55Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.535456 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.535507 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.535526 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.535549 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.535567 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:55 crc kubenswrapper[4691]: E0930 06:20:55.556169 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:55Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.561378 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.561430 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.561449 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.561472 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.561490 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:55 crc kubenswrapper[4691]: E0930 06:20:55.582568 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:55Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.588112 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.588179 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.588195 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.588217 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.588234 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:55 crc kubenswrapper[4691]: E0930 06:20:55.607658 4691 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:55Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"5d33cbaa-1b8a-4dde-af56-05c3aae2213e\\\",\\\"systemUUID\\\":\\\"d7c4eecd-4486-44ba-8bf7-42bfa69eb2b7\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:55Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:55 crc kubenswrapper[4691]: E0930 06:20:55.607874 4691 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.610317 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.610381 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.610399 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.610423 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.610442 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.714089 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.714132 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.714148 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.714170 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.714187 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.817131 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.817230 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.817250 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.817275 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.817293 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.921492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.921553 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.921569 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.921596 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:55 crc kubenswrapper[4691]: I0930 06:20:55.921613 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:55Z","lastTransitionTime":"2025-09-30T06:20:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.025327 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.025395 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.025416 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.025444 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.025467 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:56Z","lastTransitionTime":"2025-09-30T06:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.128323 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.128387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.128410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.128440 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.128461 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:56Z","lastTransitionTime":"2025-09-30T06:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.224060 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.224217 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:56 crc kubenswrapper[4691]: E0930 06:20:56.224547 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:56 crc kubenswrapper[4691]: E0930 06:20:56.224637 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.231358 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.231396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.231414 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.231439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.231461 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:56Z","lastTransitionTime":"2025-09-30T06:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.335667 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.335732 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.335748 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.335772 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.335790 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:56Z","lastTransitionTime":"2025-09-30T06:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.439131 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.439207 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.439229 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.439261 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.439279 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:56Z","lastTransitionTime":"2025-09-30T06:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.542120 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.542189 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.542222 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.542248 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.542265 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:56Z","lastTransitionTime":"2025-09-30T06:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.644485 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.644542 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.644560 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.644584 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.644610 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:56Z","lastTransitionTime":"2025-09-30T06:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.748087 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.748149 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.748171 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.748200 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.748221 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:56Z","lastTransitionTime":"2025-09-30T06:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.850808 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.850933 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.850959 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.850993 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.851016 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:56Z","lastTransitionTime":"2025-09-30T06:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.965613 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.965649 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.965657 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.965671 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:56 crc kubenswrapper[4691]: I0930 06:20:56.965681 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:56Z","lastTransitionTime":"2025-09-30T06:20:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.068996 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.069101 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.069120 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.069213 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.069235 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:57Z","lastTransitionTime":"2025-09-30T06:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.154870 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:57 crc kubenswrapper[4691]: E0930 06:20:57.155131 4691 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:20:57 crc kubenswrapper[4691]: E0930 06:20:57.155280 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs podName:a8ed6f92-0b98-4b1b-a46e-4d0604d686a1 nodeName:}" failed. No retries permitted until 2025-09-30 06:22:01.155240316 +0000 UTC m=+164.630261396 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs") pod "network-metrics-daemon-svjxq" (UID: "a8ed6f92-0b98-4b1b-a46e-4d0604d686a1") : object "openshift-multus"/"metrics-daemon-secret" not registered Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.171865 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.172027 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.172053 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.172090 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.172115 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:57Z","lastTransitionTime":"2025-09-30T06:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.223970 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.224057 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:57 crc kubenswrapper[4691]: E0930 06:20:57.224176 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:57 crc kubenswrapper[4691]: E0930 06:20:57.224352 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.242864 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-p7wmt" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"99a6e728-8795-424d-a99e-7141c75baad5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://746d60ed8b3230cabda106f1d2e4bc8496618515f2cfb2d6c4238375f71ad2af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6jfq9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-p7wmt\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.260553 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37b1a1c7-92d7-41ea-b5c1-4a56b40f819e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://be3a8860307e499758a11dd6561cdf1ffd968b3edf9c701b52994ea4cfe1c129\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b07ee418ef8f92ea69f7d64ef09e0de246432d4663c4c5018be94894860c6444\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4h4k7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-fxv8w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 
06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.274820 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.274866 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.274940 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.274968 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.274987 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:57Z","lastTransitionTime":"2025-09-30T06:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.278692 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-svjxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:53Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q98dd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:53Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-svjxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.299217 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.318339 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca08f5d521940a428454d90f85150e7b87e2353314c89a42b4c73740c3b23163\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.340263 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-xjjw8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5bfd073c-4582-4a65-8170-7030f4852174\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:26Z\\\",\\\"message\\\":\\\"2025-09-30T06:19:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28\\\\n2025-09-30T06:19:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_f75473c8-d557-49aa-97dd-d3a6a6daff28 to /host/opt/cni/bin/\\\\n2025-09-30T06:19:41Z [verbose] multus-daemon started\\\\n2025-09-30T06:19:41Z [verbose] Readiness Indicator file check\\\\n2025-09-30T06:20:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:20:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tjq2l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-xjjw8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.360365 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://03f725fd435423024f16548f2fdd44ad1ddc26d1d485d56cc5b614ab2cac7a76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5fcea9ccb5c1331b7ae737abdcc80364814715f3c43c15793316f16acf0502a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.378769 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.378847 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.378873 4691 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.378941 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.378966 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:57Z","lastTransitionTime":"2025-09-30T06:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.379245 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.394797 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-8htrc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7c1c8663-d263-4b8c-93fa-05ee1b61d7f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://73cf34ef0b11305b959da324b55f8a211f0c278cf2dec835bb6c91ad385e4dca\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dj86t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-8htrc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.412147 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69b46ade-8260-448f-84b7-506632d23ff9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ff4769002bae0f06423979897a58fdb4b2a78f9ea00abc1d5df5c8d4385c0919\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xszvq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4w4k6\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.444385 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e55ae5a0-03bf-427d-92b2-39cdff18340d\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://56f6cedf7ad601b81415785791be352c849e1d559ab8e6a0f33de2c89a93cf09\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a65513dc3f11ab14acbafe004e02c1b395b74935e84a00e75d9936bde03a97fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6240aa4c18020b30de9389e7a1869c736424444208d4011f228b4e5432520cc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702
f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2841452dae413fa8cdf532c4933ed52cfb007be43a45ee2d25209b54890fd351\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4b052f8474075dfc1f07b8e0844072d5bbaa94bae1711f7e4cecc6928a65689a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://af65c5d81066ebe41bbd9d2e5f932d106474508430978bc2ae891f6adfe65f82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6c22d45f25c21aeb36f236e08dc8d80f325dc86a5b68cec8b18f296a72dec7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://81fab18a863da719b451037eb3b03c7ab65ea50664c9874a9ce59100121187a7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.466321 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cb1cfb02-1e6b-4bfb-8104-f1d137231de9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07fbce63750320e7345e1a99b4be96355f54c36c70e8929599b990eab3f3e637\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38c9aa49f81f590d234daa993bb92af0c48103b45f5116c27b86daa651930a2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://acbe061d62e7bf11b0003c035dce19386981672d5e63236fada6e43b92274c91\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://16fc15eff68924ff75a90b9cb7b5119d95ced9b9fe9e12968bfd077db73f37ca\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://017d7c032ea394e2dbef22cfc837c65ce54aab7db5fbd32210312b3e7042a698\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-09-30T06:19:31Z\\\",\\\"message\\\":\\\"W0930 06:19:20.583735 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0930 06:19:20.584237 1 crypto.go:601] Generating new CA for check-endpoints-signer@1759213160 cert, and key in /tmp/serving-cert-2229078217/serving-signer.crt, /tmp/serving-cert-2229078217/serving-signer.key\\\\nI0930 06:19:20.815054 1 observer_polling.go:159] Starting file observer\\\\nW0930 06:19:20.818069 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0930 06:19:20.818377 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0930 06:19:20.821284 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-2229078217/tls.crt::/tmp/serving-cert-2229078217/tls.key\\\\\\\"\\\\nF0930 06:19:31.389088 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": net/http: TLS handshake timeout\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://39d7f1cd59b9a24d4303415d873bac1ed21326847749130167868b5298b7f8e8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminat
ed\\\":{\\\"containerID\\\":\\\"cri-o://cc63f1a6320e8faacd878085b8fa4ce7df70a1b2d460c16d1599283c32511e53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.482163 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.482259 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.482278 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.482300 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.482319 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:57Z","lastTransitionTime":"2025-09-30T06:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.488542 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.504759 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"782724ba-31c7-4bd7-bd92-05a1e5a31c13\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3410624e48656bd289ed6d53742ae7dd5e0ba5148c42d9964a540ae97bd0b8d5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fff6d766366a54e73752a41e180f1e850b8e9c41a8189b7f1df5f82d28e2566e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fff6d766366a54e73752a41e180f1e850b8e9c41a8189b7f1df5f82d28e2566e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.525493 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3c3ecaa3-192f-40d4-abf8-938b9e03a661\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://469f62522e2f813405840a8654f761a52163bdabeba8cc7c57998ccd40548370\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198c76a83115408261c01c78cbd2845e487ad3dc896da0130b2b1338349506d8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9aadde3a7a603aae354d4f9dcf5ef288aeceabfb73ee3c92398fb3419326b27a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d2f054ea0da71d1f65e501212a66771c5e65bfc123e3a3acd5e13900def039d8\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manage
r-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.542997 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c655fbd5-0708-4151-b0fb-a97d8e1826bc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:20:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c055c319af36509c20e700f8f5025b0d356ca5e6038be80dd69282a1f1ad716b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://810c839e6c66bbacb466fb7023bed728b17be9d13025e2db26ee5b40fea124f7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d1f3d8c18456850212ce283d46273b39939040ddd575f193acc0910cf479f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://deebf28973e8320961c1318918377f952fadba50dec1e580e461de659c63f23e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:18Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:18Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:17Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.562946 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:38Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://11fb453d515798fa4f85603df19a5dc7d305375d25c0f679598613aba7312328\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.585169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.585242 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.585263 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.585294 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.585316 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:57Z","lastTransitionTime":"2025-09-30T06:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.586316 4691 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-nzp64" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f8a4c6e8-ff14-4aa3-89bd-987f9d402ea6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b7ceecf30a44692340aed4f0df2aa58e8c9552f8ef872b7a872b29516fa2068\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2d675d9797e8541b6fff5df78a8d0701a51d23b03bd2d489d0489a82f440b4a1\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8aacdb742861442ec0e7b14904397b4e1a4bd115cd56e6105b37c42cb762b0dd\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1f6c31e7404be56f11fd658c5e92f44de4704b9c2d5ebbce1fd0af40d2ca8c9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://74da2d6dc28ebfcbfff21e47ebad84a906c4686fddf32d04eb494be40d811b86\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0d1c4f2763fc5ea8ecd81a1c6aff31d291156a4db150fc70c0111b811bd5467c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f313d83849679400f0ef2223f644c042290a33d55cab935fdbe9af123c88d210\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tmgj7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-nzp64\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.617334 4691 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36c
dd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-con
troller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-09-30T06:20:35Z\\\",\\\"message\\\":\\\"selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.254:443:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {e4e4203e-87c7-4024-930a-5d6bdfe2bdde}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:} {Op:mutate Table:NB_Global Row:map[] Rows:[] Columns:[] Mutations:[{Column:nb_cfg Mutator:+= Value:1}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {6011affd-30a6-4be6-872d-e4cf1ca780cf}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:20:35.149244 6712 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0930 06:20:35.149429 6712 model_client.go:382] Update operations generated as: [{Op:update Table:NAT Row:map[external_ip:192.168.126.11 logical_ip:10.217.0.92 options:{GoMap:map[stateless:false]} type:snat] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {73135118-cf1b-4568-bd31-2f50308bf69d}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0930 06:20:35.149455 6712 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0930 06:20:35.149188 6712 obj_retry.go:386] Retry successful for *v1.Pod openshift-multus/multus-xjjw8 after 0 failed attempt(s)\\\\nI0930 06:20:35.149471 6712 model_client.go:398] Mutate operations generated as: [{Op:mutate Table:L\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-09-30T06:20:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-09-30T06:19:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-09-30T06:19:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-09-30T06:19:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5nvgw\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-09-30T06:19:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-sjmvw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-09-30T06:20:57Z is after 2025-08-24T17:21:41Z" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.688185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.688241 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.688260 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.688285 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.688302 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:57Z","lastTransitionTime":"2025-09-30T06:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.791100 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.791164 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.791185 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.791215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.791246 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:57Z","lastTransitionTime":"2025-09-30T06:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.894045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.894095 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.894114 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.894136 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.894153 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:57Z","lastTransitionTime":"2025-09-30T06:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.997343 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.997397 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.997415 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.997438 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:57 crc kubenswrapper[4691]: I0930 06:20:57.997455 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:57Z","lastTransitionTime":"2025-09-30T06:20:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.100648 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.100705 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.100725 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.100747 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.100764 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:58Z","lastTransitionTime":"2025-09-30T06:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.203432 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.203492 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.203508 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.203535 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.203555 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:58Z","lastTransitionTime":"2025-09-30T06:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.224377 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:20:58 crc kubenswrapper[4691]: E0930 06:20:58.224551 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.224406 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:20:58 crc kubenswrapper[4691]: E0930 06:20:58.225052 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.307394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.307454 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.307478 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.307506 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.307525 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:58Z","lastTransitionTime":"2025-09-30T06:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.410683 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.410726 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.410744 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.410765 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.410781 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:58Z","lastTransitionTime":"2025-09-30T06:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.513843 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.513941 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.513967 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.513991 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.514007 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:58Z","lastTransitionTime":"2025-09-30T06:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.617559 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.617619 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.617635 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.617661 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.617678 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:58Z","lastTransitionTime":"2025-09-30T06:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.722008 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.722073 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.722098 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.722128 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.722150 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:58Z","lastTransitionTime":"2025-09-30T06:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
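The condition object in these "Node became not ready" entries is plain JSON and can be inspected directly. A minimal Go sketch follows, using the condition exactly as logged; the NodeCondition struct is a trimmed, hypothetical stand-in for the Kubernetes API type, not an import of it:

// condition.go - parses the node condition object that setters.go:603
// logs above and prints the fields that explain the NotReady state.
package main

import (
	"encoding/json"
	"fmt"
)

// NodeCondition mirrors only the fields visible in the log line.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	// Copied verbatim from a "Node became not ready" entry above.
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:58Z","lastTransitionTime":"2025-09-30T06:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	fmt.Printf("condition %s=%s reason=%s\nmessage: %s\n", c.Type, c.Status, c.Reason, c.Message)
}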
Has your network provider started?"} Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.824781 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.824860 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.824882 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.824951 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.824989 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:58Z","lastTransitionTime":"2025-09-30T06:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.927309 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.927370 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.927387 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.927410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:58 crc kubenswrapper[4691]: I0930 06:20:58.927427 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:58Z","lastTransitionTime":"2025-09-30T06:20:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.029875 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.029962 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.029978 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.030002 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.030020 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:59Z","lastTransitionTime":"2025-09-30T06:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
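The same five-event cycle repeats roughly every tenth of a second while the node stays NotReady. The interval can be derived from any two klog headers in this log, as in the small sketch below; the layout string is an assumption about the klog header format, which carries month, day, and a microsecond clock but no year:

// cadence.go - measures the spacing of the status-loop entries by
// parsing two klog timestamps taken verbatim from the log above.
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "0102 15:04:05.000000" // assumed klog header layout (mmdd hh:mm:ss.micros)
	a, err := time.Parse(layout, "0930 06:20:58.824781")
	if err != nil {
		panic(err)
	}
	b, err := time.Parse(layout, "0930 06:20:58.927309")
	if err != nil {
		panic(err)
	}
	fmt.Println("status update interval:", b.Sub(a)) // prints 102.528ms
}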
Has your network provider started?"} Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.132686 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.132747 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.132771 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.132801 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.132821 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:59Z","lastTransitionTime":"2025-09-30T06:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.224263 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.224275 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:20:59 crc kubenswrapper[4691]: E0930 06:20:59.224528 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:20:59 crc kubenswrapper[4691]: E0930 06:20:59.224800 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.235634 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.235718 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.235746 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.235780 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.235805 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:59Z","lastTransitionTime":"2025-09-30T06:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.338394 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.338444 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.338455 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.338473 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.338486 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:59Z","lastTransitionTime":"2025-09-30T06:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.441286 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.441345 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.441361 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.441385 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.441404 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:59Z","lastTransitionTime":"2025-09-30T06:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.544284 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.544352 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.544373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.544396 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.544417 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:59Z","lastTransitionTime":"2025-09-30T06:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.646694 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.646726 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.646741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.646760 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.646777 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:59Z","lastTransitionTime":"2025-09-30T06:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.748824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.748914 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.748940 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.748969 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.748990 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:59Z","lastTransitionTime":"2025-09-30T06:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.851962 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.852110 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.852140 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.852172 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.852191 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:59Z","lastTransitionTime":"2025-09-30T06:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.954591 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.954652 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.954668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.954690 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:20:59 crc kubenswrapper[4691]: I0930 06:20:59.954707 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:20:59Z","lastTransitionTime":"2025-09-30T06:20:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.058249 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.058304 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.058320 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.058343 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.058359 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:00Z","lastTransitionTime":"2025-09-30T06:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.161045 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.161106 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.161129 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.161158 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.161179 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:00Z","lastTransitionTime":"2025-09-30T06:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.224114 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.224280 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:00 crc kubenswrapper[4691]: E0930 06:21:00.224829 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:00 crc kubenswrapper[4691]: E0930 06:21:00.225012 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.264725 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.264788 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.264810 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.264837 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.264858 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:00Z","lastTransitionTime":"2025-09-30T06:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.368100 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.368158 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.368174 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.368198 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.368215 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:00Z","lastTransitionTime":"2025-09-30T06:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.471310 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.471359 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.471399 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.471422 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.471438 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:00Z","lastTransitionTime":"2025-09-30T06:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.574398 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.574445 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.574562 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.574598 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.574614 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:00Z","lastTransitionTime":"2025-09-30T06:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.677443 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.677505 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.677523 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.677547 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.677566 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:00Z","lastTransitionTime":"2025-09-30T06:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.779578 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.779628 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.779644 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.779668 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.779687 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:00Z","lastTransitionTime":"2025-09-30T06:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.882372 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.882432 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.882449 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.882473 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.882490 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:00Z","lastTransitionTime":"2025-09-30T06:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.985143 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.985199 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.985215 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.985238 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:00 crc kubenswrapper[4691]: I0930 06:21:00.985256 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:00Z","lastTransitionTime":"2025-09-30T06:21:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.087973 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.088037 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.088054 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.088078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.088095 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:01Z","lastTransitionTime":"2025-09-30T06:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.191067 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.191169 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.191189 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.191213 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.191231 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:01Z","lastTransitionTime":"2025-09-30T06:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.224261 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:01 crc kubenswrapper[4691]: E0930 06:21:01.224463 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.224580 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:01 crc kubenswrapper[4691]: E0930 06:21:01.224832 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.293963 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.294068 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.294088 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.294118 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.294135 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:01Z","lastTransitionTime":"2025-09-30T06:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.397840 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.397945 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.397964 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.397990 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.398009 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:01Z","lastTransitionTime":"2025-09-30T06:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.502232 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.502320 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.502341 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.502373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.502396 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:01Z","lastTransitionTime":"2025-09-30T06:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.606111 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.606193 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.606213 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.606249 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.606270 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:01Z","lastTransitionTime":"2025-09-30T06:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.709123 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.709200 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.709228 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.709260 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.709286 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:01Z","lastTransitionTime":"2025-09-30T06:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.812704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.812772 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.812790 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.812819 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.812835 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:01Z","lastTransitionTime":"2025-09-30T06:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.916712 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.916780 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.916796 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.916821 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:01 crc kubenswrapper[4691]: I0930 06:21:01.916840 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:01Z","lastTransitionTime":"2025-09-30T06:21:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.022036 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.022078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.022091 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.022112 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.022127 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:02Z","lastTransitionTime":"2025-09-30T06:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.125249 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.125296 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.125308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.125327 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.125341 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:02Z","lastTransitionTime":"2025-09-30T06:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.224244 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.224343 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:02 crc kubenswrapper[4691]: E0930 06:21:02.224417 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:02 crc kubenswrapper[4691]: E0930 06:21:02.224485 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.228662 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.228727 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.228745 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.228767 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.228785 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:02Z","lastTransitionTime":"2025-09-30T06:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.331328 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.331397 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.331420 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.331450 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.331472 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:02Z","lastTransitionTime":"2025-09-30T06:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.434016 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.434078 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.434095 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.434118 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.434135 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:02Z","lastTransitionTime":"2025-09-30T06:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.536660 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.536718 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.536734 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.536757 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.536774 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:02Z","lastTransitionTime":"2025-09-30T06:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.641137 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.641197 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.641231 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.641267 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.641290 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:02Z","lastTransitionTime":"2025-09-30T06:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.744763 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.744824 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.744850 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.744914 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.744940 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:02Z","lastTransitionTime":"2025-09-30T06:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.848316 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.848385 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.848410 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.848439 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.848461 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:02Z","lastTransitionTime":"2025-09-30T06:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.951992 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.952050 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.952066 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.952094 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:02 crc kubenswrapper[4691]: I0930 06:21:02.952112 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:02Z","lastTransitionTime":"2025-09-30T06:21:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.055305 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.055356 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.055377 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.055403 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.055424 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:03Z","lastTransitionTime":"2025-09-30T06:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.157308 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.157359 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.157375 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.157400 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.157417 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:03Z","lastTransitionTime":"2025-09-30T06:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.224302 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:03 crc kubenswrapper[4691]: E0930 06:21:03.224576 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.225118 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:03 crc kubenswrapper[4691]: E0930 06:21:03.225243 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.260568 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.260636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.260655 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.260691 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.260713 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:03Z","lastTransitionTime":"2025-09-30T06:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.364531 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.364627 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.364644 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.364751 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.364770 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:03Z","lastTransitionTime":"2025-09-30T06:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.467722 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.467782 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.467799 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.467823 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.467843 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:03Z","lastTransitionTime":"2025-09-30T06:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.571373 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.571556 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.571633 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.571671 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.571745 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:03Z","lastTransitionTime":"2025-09-30T06:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.674821 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.674902 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.674914 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.674934 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.674950 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:03Z","lastTransitionTime":"2025-09-30T06:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.777809 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.777881 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.777925 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.777950 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.777967 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:03Z","lastTransitionTime":"2025-09-30T06:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.880936 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.881022 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.881047 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.881087 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.881114 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:03Z","lastTransitionTime":"2025-09-30T06:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.983846 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.983925 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.983944 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.983966 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:03 crc kubenswrapper[4691]: I0930 06:21:03.983982 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:03Z","lastTransitionTime":"2025-09-30T06:21:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.087366 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.087431 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.087447 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.087473 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.087490 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:04Z","lastTransitionTime":"2025-09-30T06:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.190116 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.190213 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.190232 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.190254 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.190272 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:04Z","lastTransitionTime":"2025-09-30T06:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.223771 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:04 crc kubenswrapper[4691]: E0930 06:21:04.224002 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.224069 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.225002 4691 scope.go:117] "RemoveContainer" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"
Sep 30 06:21:04 crc kubenswrapper[4691]: E0930 06:21:04.225177 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Sep 30 06:21:04 crc kubenswrapper[4691]: E0930 06:21:04.225365 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-sjmvw_openshift-ovn-kubernetes(6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d)\"" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.292729 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.292778 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.292796 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.292820 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.292838 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:04Z","lastTransitionTime":"2025-09-30T06:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.396255 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.396324 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.396344 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.396371 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.396391 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:04Z","lastTransitionTime":"2025-09-30T06:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
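The "back-off 40s" for ovnkube-controller above matches the kubelet's restart backoff for crashing containers, which by default starts at 10s and doubles per consecutive failure up to a 5-minute cap (40s is therefore the third failed restart; the "back-off 10s" for kube-multus later in the log is the first). A sketch of that schedule; the 10s base and 5m cap are the well-known kubelet defaults, while the helper itself is illustrative:

```go
// backoff.go — sketch of the schedule behind CrashLoopBackOff messages.
package main

import (
	"fmt"
	"time"
)

// restartDelay returns the delay before restart attempt n (0-based):
// 10s, 20s, 40s, 80s, ... capped at 5 minutes.
func restartDelay(n int) time.Duration {
	const (
		base     = 10 * time.Second
		maxDelay = 5 * time.Minute
	)
	d := base
	for i := 0; i < n; i++ {
		d *= 2
		if d >= maxDelay {
			return maxDelay
		}
	}
	return d
}

func main() {
	for n := 0; n < 6; n++ {
		fmt.Printf("failure %d -> back-off %s\n", n+1, restartDelay(n))
	}
	// failure 3 -> back-off 40s, as logged for ovnkube-controller.
}
```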
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.499596 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.499671 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.499694 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.499724 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.499746 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:04Z","lastTransitionTime":"2025-09-30T06:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.602989 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.603074 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.603098 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.603164 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.603190 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:04Z","lastTransitionTime":"2025-09-30T06:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.706997 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.707075 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.707096 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.707124 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.707143 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:04Z","lastTransitionTime":"2025-09-30T06:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.810758 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.810841 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.810864 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.810919 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.810938 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:04Z","lastTransitionTime":"2025-09-30T06:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.914575 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.914688 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.914704 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.914730 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:04 crc kubenswrapper[4691]: I0930 06:21:04.914747 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:04Z","lastTransitionTime":"2025-09-30T06:21:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.017856 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.017976 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.017998 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.018031 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.018053 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:05Z","lastTransitionTime":"2025-09-30T06:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.121245 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.121314 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.121332 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.121357 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.121372 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:05Z","lastTransitionTime":"2025-09-30T06:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.223636 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.223698 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.223717 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.223741 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.223762 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:05Z","lastTransitionTime":"2025-09-30T06:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.223993 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.224076 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:05 crc kubenswrapper[4691]: E0930 06:21:05.224158 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:05 crc kubenswrapper[4691]: E0930 06:21:05.224263 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.327257 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.327333 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.327351 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.327377 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.327394 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:05Z","lastTransitionTime":"2025-09-30T06:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.430200 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.430269 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.430287 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.430310 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.430326 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:05Z","lastTransitionTime":"2025-09-30T06:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.533674 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.533736 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.533754 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.533776 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.533794 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:05Z","lastTransitionTime":"2025-09-30T06:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.637113 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.637205 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.637228 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.637263 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.637287 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:05Z","lastTransitionTime":"2025-09-30T06:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.718039 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.718106 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.718126 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.718150 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.718168 4691 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-09-30T06:21:05Z","lastTransitionTime":"2025-09-30T06:21:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.788628 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"]
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.789201 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.792757 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.792848 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.793684 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.794157 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.817149 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=87.817117812 podStartE2EDuration="1m27.817117812s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:05.816655727 +0000 UTC m=+109.291676807" watchObservedRunningTime="2025-09-30 06:21:05.817117812 +0000 UTC m=+109.292138892"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.859980 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=52.859952173 podStartE2EDuration="52.859952173s" podCreationTimestamp="2025-09-30 06:20:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:05.841981288 +0000 UTC m=+109.317002358" watchObservedRunningTime="2025-09-30 06:21:05.859952173 +0000 UTC m=+109.334973243"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.875256 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=25.875193822 podStartE2EDuration="25.875193822s" podCreationTimestamp="2025-09-30 06:20:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:05.874991425 +0000 UTC m=+109.350012535" watchObservedRunningTime="2025-09-30 06:21:05.875193822 +0000 UTC m=+109.350214892"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.939854 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nzp64" podStartSLOduration=87.939803772 podStartE2EDuration="1m27.939803772s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:05.901418872 +0000 UTC m=+109.376439962" watchObservedRunningTime="2025-09-30 06:21:05.939803772 +0000 UTC m=+109.414824842"
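Each "Observed pod startup duration" record above is plain timestamp arithmetic: podStartSLOduration is the time the pod was observed running minus podCreationTimestamp (any image-pull window would be excluded, but firstStartedPulling/lastFinishedPulling are the zero time here). For kube-controller-manager-crc: 06:21:05.817117812 − 06:19:38 = 87.817117812 s, exactly the value logged. A worked sketch of the subtraction:

```go
// slo.go — reproduce the podStartSLOduration arithmetic from the log.
package main

import (
	"fmt"
	"time"
)

func main() {
	// Layout matching the timestamps printed in the records above.
	const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

	created, _ := time.Parse(layout, "2025-09-30 06:19:38 +0000 UTC")
	observed, _ := time.Parse(layout, "2025-09-30 06:21:05.817117812 +0000 UTC")

	// Prints 1m27.817117812s — the podStartSLOduration=87.817117812
	// logged for kube-controller-manager-crc.
	fmt.Println(observed.Sub(created))
}
```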
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.958234 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/07d9acdf-6537-4d43-ad25-65f839888daf-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.958296 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/07d9acdf-6537-4d43-ad25-65f839888daf-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.958326 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d9acdf-6537-4d43-ad25-65f839888daf-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.958382 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d9acdf-6537-4d43-ad25-65f839888daf-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"
Sep 30 06:21:05 crc kubenswrapper[4691]: I0930 06:21:05.958582 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07d9acdf-6537-4d43-ad25-65f839888daf-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.024747 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-xjjw8" podStartSLOduration=88.024719202 podStartE2EDuration="1m28.024719202s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:06.007436829 +0000 UTC m=+109.482457909" watchObservedRunningTime="2025-09-30 06:21:06.024719202 +0000 UTC m=+109.499740282"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.025135 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-p7wmt" podStartSLOduration=88.025125875 podStartE2EDuration="1m28.025125875s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:06.022822321 +0000 UTC m=+109.497843421" watchObservedRunningTime="2025-09-30 06:21:06.025125875 +0000 UTC m=+109.500146955"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.040564 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" podStartSLOduration=87.040542039 podStartE2EDuration="1m27.040542039s" podCreationTimestamp="2025-09-30 06:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:06.039461554 +0000 UTC m=+109.514482664" watchObservedRunningTime="2025-09-30 06:21:06.040542039 +0000 UTC m=+109.515563119"
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-fxv8w" podStartSLOduration=87.040542039 podStartE2EDuration="1m27.040542039s" podCreationTimestamp="2025-09-30 06:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:06.039461554 +0000 UTC m=+109.514482664" watchObservedRunningTime="2025-09-30 06:21:06.040542039 +0000 UTC m=+109.515563119" Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.059479 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07d9acdf-6537-4d43-ad25-65f839888daf-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5" Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.059662 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/07d9acdf-6537-4d43-ad25-65f839888daf-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5" Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.059728 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/07d9acdf-6537-4d43-ad25-65f839888daf-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5" Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.059785 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d9acdf-6537-4d43-ad25-65f839888daf-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5" Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.059792 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/07d9acdf-6537-4d43-ad25-65f839888daf-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5" Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.059873 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/07d9acdf-6537-4d43-ad25-65f839888daf-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5" Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.059948 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d9acdf-6537-4d43-ad25-65f839888daf-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5" Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.061307 4691 
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.061307 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/07d9acdf-6537-4d43-ad25-65f839888daf-service-ca\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.069837 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07d9acdf-6537-4d43-ad25-65f839888daf-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.101621 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/07d9acdf-6537-4d43-ad25-65f839888daf-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-x6ds5\" (UID: \"07d9acdf-6537-4d43-ad25-65f839888daf\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.111809 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.138038 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=90.138004552 podStartE2EDuration="1m30.138004552s" podCreationTimestamp="2025-09-30 06:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:06.107476684 +0000 UTC m=+109.582497784" watchObservedRunningTime="2025-09-30 06:21:06.138004552 +0000 UTC m=+109.613025642"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.201660 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-8htrc" podStartSLOduration=88.20163621 podStartE2EDuration="1m28.20163621s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:06.201099843 +0000 UTC m=+109.676120913" watchObservedRunningTime="2025-09-30 06:21:06.20163621 +0000 UTC m=+109.676657280"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.213258 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podStartSLOduration=88.213241102 podStartE2EDuration="1m28.213241102s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:06.212967443 +0000 UTC m=+109.687988523" watchObservedRunningTime="2025-09-30 06:21:06.213241102 +0000 UTC m=+109.688262172"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.224082 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.224115 4691 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:06 crc kubenswrapper[4691]: E0930 06:21:06.224233 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:06 crc kubenswrapper[4691]: E0930 06:21:06.224372 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.247805 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=90.247779799 podStartE2EDuration="1m30.247779799s" podCreationTimestamp="2025-09-30 06:19:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:06.245729092 +0000 UTC m=+109.720750202" watchObservedRunningTime="2025-09-30 06:21:06.247779799 +0000 UTC m=+109.722800879" Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.834709 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5" event={"ID":"07d9acdf-6537-4d43-ad25-65f839888daf","Type":"ContainerStarted","Data":"b8d6a1c4e637360b88e53f51ed9d82902639bc1d19d5322b4d40ddb554a7310c"} Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.835617 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5" event={"ID":"07d9acdf-6537-4d43-ad25-65f839888daf","Type":"ContainerStarted","Data":"0f894ba8d4158b9ef5249f161558cf2fb4acd4c2a8fd0b968b27655e2c0dc195"} Sep 30 06:21:06 crc kubenswrapper[4691]: I0930 06:21:06.856078 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-x6ds5" podStartSLOduration=88.856052006 podStartE2EDuration="1m28.856052006s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:06.855760577 +0000 UTC m=+110.330781657" watchObservedRunningTime="2025-09-30 06:21:06.856052006 +0000 UTC m=+110.331073076" Sep 30 06:21:07 crc kubenswrapper[4691]: I0930 06:21:07.224361 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:07 crc kubenswrapper[4691]: I0930 06:21:07.224373 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:07 crc kubenswrapper[4691]: E0930 06:21:07.226497 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:07 crc kubenswrapper[4691]: E0930 06:21:07.226577 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:08 crc kubenswrapper[4691]: I0930 06:21:08.224525 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:08 crc kubenswrapper[4691]: I0930 06:21:08.224543 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:08 crc kubenswrapper[4691]: E0930 06:21:08.224786 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:08 crc kubenswrapper[4691]: E0930 06:21:08.224917 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:09 crc kubenswrapper[4691]: I0930 06:21:09.224413 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:09 crc kubenswrapper[4691]: I0930 06:21:09.224510 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:09 crc kubenswrapper[4691]: E0930 06:21:09.224599 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:09 crc kubenswrapper[4691]: E0930 06:21:09.224696 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:10 crc kubenswrapper[4691]: I0930 06:21:10.224292 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:10 crc kubenswrapper[4691]: E0930 06:21:10.224464 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:10 crc kubenswrapper[4691]: I0930 06:21:10.224820 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:10 crc kubenswrapper[4691]: E0930 06:21:10.225161 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:11 crc kubenswrapper[4691]: I0930 06:21:11.224317 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:11 crc kubenswrapper[4691]: E0930 06:21:11.224515 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:11 crc kubenswrapper[4691]: I0930 06:21:11.225722 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:11 crc kubenswrapper[4691]: E0930 06:21:11.226117 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:12 crc kubenswrapper[4691]: I0930 06:21:12.224358 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:12 crc kubenswrapper[4691]: I0930 06:21:12.224380 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:12 crc kubenswrapper[4691]: E0930 06:21:12.224486 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:12 crc kubenswrapper[4691]: E0930 06:21:12.224581 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:12 crc kubenswrapper[4691]: I0930 06:21:12.857586 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/1.log" Sep 30 06:21:12 crc kubenswrapper[4691]: I0930 06:21:12.858665 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/0.log" Sep 30 06:21:12 crc kubenswrapper[4691]: I0930 06:21:12.858928 4691 generic.go:334] "Generic (PLEG): container finished" podID="5bfd073c-4582-4a65-8170-7030f4852174" containerID="139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d" exitCode=1 Sep 30 06:21:12 crc kubenswrapper[4691]: I0930 06:21:12.859044 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjjw8" event={"ID":"5bfd073c-4582-4a65-8170-7030f4852174","Type":"ContainerDied","Data":"139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d"} Sep 30 06:21:12 crc kubenswrapper[4691]: I0930 06:21:12.859261 4691 scope.go:117] "RemoveContainer" containerID="3a065eb18eb909fb60011f5ae6adf327088480996585d74e861628ffdd759bd9" Sep 30 06:21:12 crc kubenswrapper[4691]: I0930 06:21:12.859800 4691 scope.go:117] "RemoveContainer" containerID="139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d" Sep 30 06:21:12 crc kubenswrapper[4691]: E0930 06:21:12.860073 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-xjjw8_openshift-multus(5bfd073c-4582-4a65-8170-7030f4852174)\"" pod="openshift-multus/multus-xjjw8" podUID="5bfd073c-4582-4a65-8170-7030f4852174" Sep 30 06:21:13 crc kubenswrapper[4691]: I0930 06:21:13.224637 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:13 crc kubenswrapper[4691]: E0930 06:21:13.225238 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:13 crc kubenswrapper[4691]: I0930 06:21:13.224661 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:13 crc kubenswrapper[4691]: E0930 06:21:13.225696 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:13 crc kubenswrapper[4691]: I0930 06:21:13.864526 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/1.log" Sep 30 06:21:14 crc kubenswrapper[4691]: I0930 06:21:14.223879 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:14 crc kubenswrapper[4691]: E0930 06:21:14.224076 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:14 crc kubenswrapper[4691]: I0930 06:21:14.224193 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:14 crc kubenswrapper[4691]: E0930 06:21:14.224818 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:15 crc kubenswrapper[4691]: I0930 06:21:15.224183 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:15 crc kubenswrapper[4691]: I0930 06:21:15.224183 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:15 crc kubenswrapper[4691]: E0930 06:21:15.224372 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:15 crc kubenswrapper[4691]: E0930 06:21:15.224498 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:16 crc kubenswrapper[4691]: I0930 06:21:16.223981 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:16 crc kubenswrapper[4691]: I0930 06:21:16.224076 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:16 crc kubenswrapper[4691]: E0930 06:21:16.224723 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:16 crc kubenswrapper[4691]: E0930 06:21:16.225395 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:17 crc kubenswrapper[4691]: E0930 06:21:17.205929 4691 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Sep 30 06:21:17 crc kubenswrapper[4691]: I0930 06:21:17.224768 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:17 crc kubenswrapper[4691]: I0930 06:21:17.224848 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:17 crc kubenswrapper[4691]: E0930 06:21:17.225486 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:17 crc kubenswrapper[4691]: E0930 06:21:17.225716 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:17 crc kubenswrapper[4691]: I0930 06:21:17.226310 4691 scope.go:117] "RemoveContainer" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e" Sep 30 06:21:17 crc kubenswrapper[4691]: E0930 06:21:17.339553 4691 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 06:21:17 crc kubenswrapper[4691]: I0930 06:21:17.883389 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/3.log" Sep 30 06:21:17 crc kubenswrapper[4691]: I0930 06:21:17.887359 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerStarted","Data":"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"} Sep 30 06:21:17 crc kubenswrapper[4691]: I0930 06:21:17.888082 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:21:18 crc kubenswrapper[4691]: I0930 06:21:18.071080 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podStartSLOduration=100.071051998 podStartE2EDuration="1m40.071051998s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:17.937012144 +0000 UTC m=+121.412033284" watchObservedRunningTime="2025-09-30 06:21:18.071051998 +0000 UTC m=+121.546073068" Sep 30 06:21:18 crc kubenswrapper[4691]: I0930 06:21:18.071537 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-svjxq"] Sep 30 06:21:18 crc kubenswrapper[4691]: I0930 06:21:18.071713 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:18 crc kubenswrapper[4691]: E0930 06:21:18.071938 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:18 crc kubenswrapper[4691]: I0930 06:21:18.224005 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:18 crc kubenswrapper[4691]: E0930 06:21:18.224304 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:19 crc kubenswrapper[4691]: I0930 06:21:19.224452 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:19 crc kubenswrapper[4691]: I0930 06:21:19.224521 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:19 crc kubenswrapper[4691]: E0930 06:21:19.224915 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:19 crc kubenswrapper[4691]: E0930 06:21:19.224954 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:19 crc kubenswrapper[4691]: I0930 06:21:19.224558 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:19 crc kubenswrapper[4691]: E0930 06:21:19.225202 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:20 crc kubenswrapper[4691]: I0930 06:21:20.224189 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:20 crc kubenswrapper[4691]: E0930 06:21:20.224400 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:21 crc kubenswrapper[4691]: I0930 06:21:21.224523 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:21 crc kubenswrapper[4691]: I0930 06:21:21.224614 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:21 crc kubenswrapper[4691]: I0930 06:21:21.224742 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:21 crc kubenswrapper[4691]: E0930 06:21:21.224831 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:21 crc kubenswrapper[4691]: E0930 06:21:21.225015 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:21 crc kubenswrapper[4691]: E0930 06:21:21.225134 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:22 crc kubenswrapper[4691]: I0930 06:21:22.224164 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:22 crc kubenswrapper[4691]: E0930 06:21:22.224409 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:22 crc kubenswrapper[4691]: E0930 06:21:22.341446 4691 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Sep 30 06:21:23 crc kubenswrapper[4691]: I0930 06:21:23.224334 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:23 crc kubenswrapper[4691]: I0930 06:21:23.224450 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:23 crc kubenswrapper[4691]: E0930 06:21:23.224530 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:23 crc kubenswrapper[4691]: I0930 06:21:23.224572 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:23 crc kubenswrapper[4691]: E0930 06:21:23.224767 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:23 crc kubenswrapper[4691]: E0930 06:21:23.224856 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:24 crc kubenswrapper[4691]: I0930 06:21:24.224226 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:24 crc kubenswrapper[4691]: E0930 06:21:24.224405 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:25 crc kubenswrapper[4691]: I0930 06:21:25.224144 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:25 crc kubenswrapper[4691]: I0930 06:21:25.224219 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:25 crc kubenswrapper[4691]: I0930 06:21:25.224143 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:25 crc kubenswrapper[4691]: E0930 06:21:25.224298 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:25 crc kubenswrapper[4691]: E0930 06:21:25.224411 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:25 crc kubenswrapper[4691]: E0930 06:21:25.224572 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:26 crc kubenswrapper[4691]: I0930 06:21:26.224656 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:26 crc kubenswrapper[4691]: E0930 06:21:26.225276 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:27 crc kubenswrapper[4691]: I0930 06:21:27.235293 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:27 crc kubenswrapper[4691]: E0930 06:21:27.236758 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:27 crc kubenswrapper[4691]: I0930 06:21:27.236853 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:27 crc kubenswrapper[4691]: I0930 06:21:27.236931 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:27 crc kubenswrapper[4691]: E0930 06:21:27.237151 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:27 crc kubenswrapper[4691]: E0930 06:21:27.237322 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:27 crc kubenswrapper[4691]: I0930 06:21:27.237666 4691 scope.go:117] "RemoveContainer" containerID="139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d" Sep 30 06:21:27 crc kubenswrapper[4691]: E0930 06:21:27.342399 4691 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Sep 30 06:21:27 crc kubenswrapper[4691]: I0930 06:21:27.928720 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/1.log" Sep 30 06:21:27 crc kubenswrapper[4691]: I0930 06:21:27.928807 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjjw8" event={"ID":"5bfd073c-4582-4a65-8170-7030f4852174","Type":"ContainerStarted","Data":"6dc9f6de9a72745abb5fcd3a1cc65a6aade6d9c7dc8696106fc1a98b3550d079"} Sep 30 06:21:28 crc kubenswrapper[4691]: I0930 06:21:28.224493 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:28 crc kubenswrapper[4691]: E0930 06:21:28.224686 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:29 crc kubenswrapper[4691]: I0930 06:21:29.224559 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:29 crc kubenswrapper[4691]: I0930 06:21:29.224559 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:29 crc kubenswrapper[4691]: I0930 06:21:29.224724 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:29 crc kubenswrapper[4691]: E0930 06:21:29.224875 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:29 crc kubenswrapper[4691]: E0930 06:21:29.225025 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:29 crc kubenswrapper[4691]: E0930 06:21:29.225339 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:30 crc kubenswrapper[4691]: I0930 06:21:30.224534 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:30 crc kubenswrapper[4691]: E0930 06:21:30.224723 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:31 crc kubenswrapper[4691]: I0930 06:21:31.224463 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:31 crc kubenswrapper[4691]: I0930 06:21:31.224648 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:31 crc kubenswrapper[4691]: I0930 06:21:31.224646 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:31 crc kubenswrapper[4691]: E0930 06:21:31.224756 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Sep 30 06:21:31 crc kubenswrapper[4691]: E0930 06:21:31.225039 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Sep 30 06:21:31 crc kubenswrapper[4691]: E0930 06:21:31.225216 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-svjxq" podUID="a8ed6f92-0b98-4b1b-a46e-4d0604d686a1" Sep 30 06:21:32 crc kubenswrapper[4691]: I0930 06:21:32.224047 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:32 crc kubenswrapper[4691]: E0930 06:21:32.224240 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Sep 30 06:21:33 crc kubenswrapper[4691]: I0930 06:21:33.224167 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:21:33 crc kubenswrapper[4691]: I0930 06:21:33.224279 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:33 crc kubenswrapper[4691]: I0930 06:21:33.224179 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:33 crc kubenswrapper[4691]: I0930 06:21:33.227295 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Sep 30 06:21:33 crc kubenswrapper[4691]: I0930 06:21:33.227420 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Sep 30 06:21:33 crc kubenswrapper[4691]: I0930 06:21:33.227759 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Sep 30 06:21:33 crc kubenswrapper[4691]: I0930 06:21:33.231063 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Sep 30 06:21:34 crc kubenswrapper[4691]: I0930 06:21:34.224631 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:34 crc kubenswrapper[4691]: I0930 06:21:34.226690 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Sep 30 06:21:34 crc kubenswrapper[4691]: I0930 06:21:34.227733 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.452220 4691 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.500830 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5hlv9"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.501694 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.508078 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.508090 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.508727 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.508969 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.509616 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.510118 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.510770 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.511276 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.511914 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.518853 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kzftt"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.519533 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4bdqt"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.519989 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.520436 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.526548 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.527314 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.528012 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.539208 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.548006 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.564672 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.565067 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.565323 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.565407 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.565550 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.565941 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.566481 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.566260 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.567036 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.567215 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.567380 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.568199 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.568443 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.568703 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.568970 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.569190 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.569321 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 
06:21:36.569401 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.569572 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.569715 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.570136 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.571513 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.571977 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7bzfz"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.572446 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.573406 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.573746 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.574188 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.574235 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.574360 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.574410 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.574198 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.574505 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.575159 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.575967 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.577932 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-nklgj"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.578170 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 06:21:36 crc 
kubenswrapper[4691]: I0930 06:21:36.578426 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nklgj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.578430 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.578474 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.578724 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.580308 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.580528 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.580710 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.580743 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.581070 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.581205 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.581372 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.584338 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlxgb"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.584752 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9d949"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.585126 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.585456 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.589512 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.589543 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.589809 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.590042 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.590047 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.590069 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.590462 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.590746 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.591284 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ss8nw"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.591566 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.592062 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-thj2p"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.592482 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.597955 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.599792 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8bfj"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.600099 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.600406 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.600418 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.600647 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.600710 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-7frz7"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.600806 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.601068 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.601137 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.601712 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.601796 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.601865 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.601955 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.602093 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.602162 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.602282 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.602661 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.602285 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.604676 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.604814 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.604912 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.620133 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.620302 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.620351 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.620493 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.621013 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.622429 4691 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console"/"console-oauth-config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.622662 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.623225 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.623630 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.623874 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.624165 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.624237 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.624364 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.624831 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.625620 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.626021 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.631205 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.631221 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.631249 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.631490 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.631665 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.647686 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.647877 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.647943 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.648367 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.648463 4691 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bshxb"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.648564 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.648601 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.648816 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.648981 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.649144 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.649199 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.649210 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.649629 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.659004 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6q5zd"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.655200 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.659134 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.658080 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.658147 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.658169 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.659973 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.660179 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.661373 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.663635 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.663973 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.663993 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.664322 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.664592 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.664733 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.669569 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.670168 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-srgxj"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.670583 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.670793 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.671518 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4bdqt"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672010 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1dfc875-304c-4f81-8d12-c5463743ad08-metrics-tls\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672042 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-client-ca\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672062 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xljs\" (UniqueName: \"kubernetes.io/projected/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-kube-api-access-2xljs\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672083 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672125 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672162 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cctm5\" (UniqueName: \"kubernetes.io/projected/50e623b3-90a5-4b59-8e37-9ac5c96c3304-kube-api-access-cctm5\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672182 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f468aa01-3497-4b9b-bf5b-33aaf845e8cd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m9vv7\" (UID: \"f468aa01-3497-4b9b-bf5b-33aaf845e8cd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 
06:21:36.672196 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jx82g\" (UniqueName: \"kubernetes.io/projected/f1dfc875-304c-4f81-8d12-c5463743ad08-kube-api-access-jx82g\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672211 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9aea11-a56e-498b-8678-590b288b372f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5652b\" (UID: \"da9aea11-a56e-498b-8678-590b288b372f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672241 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf24a16c-4573-4014-8057-b4da43a0b145-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lh444\" (UID: \"bf24a16c-4573-4014-8057-b4da43a0b145\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672257 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-974f2\" (UniqueName: \"kubernetes.io/projected/162d74bd-6a30-4fa0-88b7-2aa59426c6c8-kube-api-access-974f2\") pod \"migrator-59844c95c7-6gt99\" (UID: \"162d74bd-6a30-4fa0-88b7-2aa59426c6c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672276 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-config\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672291 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-srgxj\" (UID: \"10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672306 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672323 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d6587adc-a984-4ce3-af8d-6739325c8604-default-certificate\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 
crc kubenswrapper[4691]: I0930 06:21:36.672357 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535f395c-e127-4a48-8766-707bf9d4d5a3-serving-cert\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672373 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37dd19aa-104d-4c79-859c-7161a185ad1c-encryption-config\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672389 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6587adc-a984-4ce3-af8d-6739325c8604-service-ca-bundle\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672406 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vmf4\" (UniqueName: \"kubernetes.io/projected/4e494e6a-0b34-4706-8284-5dc5086c89b3-kube-api-access-4vmf4\") pod \"dns-operator-744455d44c-6q5zd\" (UID: \"4e494e6a-0b34-4706-8284-5dc5086c89b3\") " pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672434 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672461 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672485 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-image-import-ca\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672503 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-config\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672518 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6l96\" (UniqueName: \"kubernetes.io/projected/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-kube-api-access-z6l96\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: 
\"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672539 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fck42\" (UniqueName: \"kubernetes.io/projected/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-kube-api-access-fck42\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672554 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d6587adc-a984-4ce3-af8d-6739325c8604-stats-auth\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672575 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-config\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672629 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50e623b3-90a5-4b59-8e37-9ac5c96c3304-auth-proxy-config\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672649 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2kjp\" (UniqueName: \"kubernetes.io/projected/da9aea11-a56e-498b-8678-590b288b372f-kube-api-access-q2kjp\") pod \"openshift-controller-manager-operator-756b6f6bc6-5652b\" (UID: \"da9aea11-a56e-498b-8678-590b288b372f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672675 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-etcd-client\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672692 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf24a16c-4573-4014-8057-b4da43a0b145-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lh444\" (UID: \"bf24a16c-4573-4014-8057-b4da43a0b145\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672716 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6vcq\" (UniqueName: \"kubernetes.io/projected/535f395c-e127-4a48-8766-707bf9d4d5a3-kube-api-access-p6vcq\") pod 
\"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672733 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f468aa01-3497-4b9b-bf5b-33aaf845e8cd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m9vv7\" (UID: \"f468aa01-3497-4b9b-bf5b-33aaf845e8cd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672750 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hvpt\" (UniqueName: \"kubernetes.io/projected/d6587adc-a984-4ce3-af8d-6739325c8604-kube-api-access-7hvpt\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672769 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssd8b\" (UniqueName: \"kubernetes.io/projected/46e3679a-b63e-4f7c-b118-02287f570a24-kube-api-access-ssd8b\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672806 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37dd19aa-104d-4c79-859c-7161a185ad1c-audit-dir\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672824 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672842 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-config\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672859 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/50e623b3-90a5-4b59-8e37-9ac5c96c3304-machine-approver-tls\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672873 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e623b3-90a5-4b59-8e37-9ac5c96c3304-config\") pod 
\"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.672905 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.673036 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.673059 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-images\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.673074 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.673095 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-audit\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.673105 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.673730 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dzf49"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.673113 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-etcd-serving-ca\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674140 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674136 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7dc5\" (UniqueName: \"kubernetes.io/projected/37dd19aa-104d-4c79-859c-7161a185ad1c-kube-api-access-r7dc5\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674320 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674342 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-client-ca\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674360 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6587adc-a984-4ce3-af8d-6739325c8604-metrics-certs\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674378 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-audit-policies\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674393 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-audit-policies\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674407 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37dd19aa-104d-4c79-859c-7161a185ad1c-etcd-client\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674425 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-serving-cert\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674440 4691 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-serving-cert\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674454 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/929acffa-90b0-4dfc-a65b-a8758c000f41-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674467 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1dfc875-304c-4f81-8d12-c5463743ad08-trusted-ca\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674483 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674502 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37dd19aa-104d-4c79-859c-7161a185ad1c-node-pullsecrets\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674517 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6jc8\" (UniqueName: \"kubernetes.io/projected/7a677441-8b2d-41ae-8dd8-e3334c16c700-kube-api-access-f6jc8\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674543 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da9aea11-a56e-498b-8678-590b288b372f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5652b\" (UID: \"da9aea11-a56e-498b-8678-590b288b372f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674608 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674630 4691 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e494e6a-0b34-4706-8284-5dc5086c89b3-metrics-tls\") pod \"dns-operator-744455d44c-6q5zd\" (UID: \"4e494e6a-0b34-4706-8284-5dc5086c89b3\") " pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674682 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674726 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-audit-dir\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674757 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-config\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674793 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf24a16c-4573-4014-8057-b4da43a0b145-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lh444\" (UID: \"bf24a16c-4573-4014-8057-b4da43a0b145\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674831 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37dd19aa-104d-4c79-859c-7161a185ad1c-serving-cert\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674854 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7qgl\" (UniqueName: \"kubernetes.io/projected/929acffa-90b0-4dfc-a65b-a8758c000f41-kube-api-access-j7qgl\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674873 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674922 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46e3679a-b63e-4f7c-b118-02287f570a24-audit-dir\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674957 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.674983 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wnm8\" (UniqueName: \"kubernetes.io/projected/10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b-kube-api-access-6wnm8\") pod \"multus-admission-controller-857f4d67dd-srgxj\" (UID: \"10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.675003 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-service-ca-bundle\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.675021 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f468aa01-3497-4b9b-bf5b-33aaf845e8cd-config\") pod \"kube-controller-manager-operator-78b949d7b-m9vv7\" (UID: \"f468aa01-3497-4b9b-bf5b-33aaf845e8cd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.675049 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/929acffa-90b0-4dfc-a65b-a8758c000f41-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.675069 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/929acffa-90b0-4dfc-a65b-a8758c000f41-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.675122 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.675138 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a677441-8b2d-41ae-8dd8-e3334c16c700-serving-cert\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.675164 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1dfc875-304c-4f81-8d12-c5463743ad08-bound-sa-token\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.675190 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.675268 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.675331 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-encryption-config\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.677415 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5hlv9"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.678729 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.679264 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.682380 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.683191 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.683857 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.686495 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kzftt"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.691178 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.694377 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.694591 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.696297 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.697706 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.698017 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-htgw7"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.706465 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.712567 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.712992 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.713026 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.713449 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-k6vnd"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.713775 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.713815 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ss8nw"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.713831 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9d949"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.713842 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.713853 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.713945 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.714039 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.714491 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.714528 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.715190 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.716729 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7bzfz"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.717186 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.718843 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.719963 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-stxbl"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.721273 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.723019 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-thj2p"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.723384 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.724475 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.725929 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8bfj"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.727136 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bshxb"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.729108 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nklgj"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.732459 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.736875 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.737969 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlxgb"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.739196 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.740240 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-srgxj"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.741374 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.742526 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.743351 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.744027 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.745244 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.748186 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6q5zd"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.748858 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 
06:21:36.749925 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.751286 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-k6vnd"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.752858 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.753949 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.755104 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.756162 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-htgw7"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.757318 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h6qlb"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.758521 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-5wzgc"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.758779 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.759268 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5wzgc" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.760288 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.761793 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.763055 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.763286 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.764600 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-stxbl"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.765759 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h6qlb"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.767163 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dzf49"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.768775 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gw6s6"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.771572 4691 util.go:30] "No sandbox for pod can be found. 
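From here the executor switches to reconciler_common.go:218 MountVolume operations, which materialize secret, configmap, and projected volumes on disk under the pod's directory in /var/lib/kubelet. A small illustrative helper (hypothetical, not kubelet API) computing that on-disk location for one volume named in the entries below:

```go
package main

import (
	"fmt"
	"path/filepath"
)

// podVolumeDir returns the conventional kubelet layout for a pod
// volume: <root>/pods/<uid>/volumes/<plugin>/<volume-name>.
func podVolumeDir(kubeletRoot, podUID, pluginName, volumeName string) string {
	return filepath.Join(kubeletRoot, "pods", podUID, "volumes", pluginName, volumeName)
}

func main() {
	// UID and volume name taken from the MountVolume entry for the
	// openshift-controller-manager-operator "config" ConfigMap below.
	fmt.Println(podVolumeDir(
		"/var/lib/kubelet",
		"da9aea11-a56e-498b-8678-590b288b372f",
		"kubernetes.io~configmap",
		"config",
	))
	// Output: /var/lib/kubelet/pods/da9aea11-a56e-498b-8678-590b288b372f/volumes/kubernetes.io~configmap/config
}
```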
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gw6s6" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.772486 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gw6s6"] Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.775901 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/929acffa-90b0-4dfc-a65b-a8758c000f41-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.776039 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.776144 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-encryption-config\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.776224 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a677441-8b2d-41ae-8dd8-e3334c16c700-serving-cert\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.776955 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1dfc875-304c-4f81-8d12-c5463743ad08-bound-sa-token\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.776992 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777013 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777045 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1dfc875-304c-4f81-8d12-c5463743ad08-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777064 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-client-ca\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777081 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xljs\" (UniqueName: \"kubernetes.io/projected/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-kube-api-access-2xljs\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777764 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cctm5\" (UniqueName: \"kubernetes.io/projected/50e623b3-90a5-4b59-8e37-9ac5c96c3304-kube-api-access-cctm5\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777790 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f468aa01-3497-4b9b-bf5b-33aaf845e8cd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m9vv7\" (UID: \"f468aa01-3497-4b9b-bf5b-33aaf845e8cd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777806 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jx82g\" (UniqueName: \"kubernetes.io/projected/f1dfc875-304c-4f81-8d12-c5463743ad08-kube-api-access-jx82g\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777825 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777841 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777860 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9aea11-a56e-498b-8678-590b288b372f-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-5652b\" (UID: \"da9aea11-a56e-498b-8678-590b288b372f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777865 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777894 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf24a16c-4573-4014-8057-b4da43a0b145-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lh444\" (UID: \"bf24a16c-4573-4014-8057-b4da43a0b145\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777913 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-974f2\" (UniqueName: \"kubernetes.io/projected/162d74bd-6a30-4fa0-88b7-2aa59426c6c8-kube-api-access-974f2\") pod \"migrator-59844c95c7-6gt99\" (UID: \"162d74bd-6a30-4fa0-88b7-2aa59426c6c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777931 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-config\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777945 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-srgxj\" (UID: \"10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777961 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.777978 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d6587adc-a984-4ce3-af8d-6739325c8604-default-certificate\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535f395c-e127-4a48-8766-707bf9d4d5a3-serving-cert\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778014 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37dd19aa-104d-4c79-859c-7161a185ad1c-encryption-config\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778031 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6587adc-a984-4ce3-af8d-6739325c8604-service-ca-bundle\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778045 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vmf4\" (UniqueName: \"kubernetes.io/projected/4e494e6a-0b34-4706-8284-5dc5086c89b3-kube-api-access-4vmf4\") pod \"dns-operator-744455d44c-6q5zd\" (UID: \"4e494e6a-0b34-4706-8284-5dc5086c89b3\") " pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778065 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-image-import-ca\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778079 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-config\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778096 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778115 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6l96\" (UniqueName: \"kubernetes.io/projected/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-kube-api-access-z6l96\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778131 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fck42\" (UniqueName: \"kubernetes.io/projected/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-kube-api-access-fck42\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778147 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" 
(UniqueName: \"kubernetes.io/secret/d6587adc-a984-4ce3-af8d-6739325c8604-stats-auth\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778161 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-config\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778185 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-etcd-client\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778201 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50e623b3-90a5-4b59-8e37-9ac5c96c3304-auth-proxy-config\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778217 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2kjp\" (UniqueName: \"kubernetes.io/projected/da9aea11-a56e-498b-8678-590b288b372f-kube-api-access-q2kjp\") pod \"openshift-controller-manager-operator-756b6f6bc6-5652b\" (UID: \"da9aea11-a56e-498b-8678-590b288b372f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778235 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6vcq\" (UniqueName: \"kubernetes.io/projected/535f395c-e127-4a48-8766-707bf9d4d5a3-kube-api-access-p6vcq\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778252 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf24a16c-4573-4014-8057-b4da43a0b145-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lh444\" (UID: \"bf24a16c-4573-4014-8057-b4da43a0b145\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778278 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f468aa01-3497-4b9b-bf5b-33aaf845e8cd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m9vv7\" (UID: \"f468aa01-3497-4b9b-bf5b-33aaf845e8cd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778294 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hvpt\" (UniqueName: 
\"kubernetes.io/projected/d6587adc-a984-4ce3-af8d-6739325c8604-kube-api-access-7hvpt\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778311 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssd8b\" (UniqueName: \"kubernetes.io/projected/46e3679a-b63e-4f7c-b118-02287f570a24-kube-api-access-ssd8b\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778329 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37dd19aa-104d-4c79-859c-7161a185ad1c-audit-dir\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778343 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-config\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778358 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/50e623b3-90a5-4b59-8e37-9ac5c96c3304-machine-approver-tls\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778373 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e623b3-90a5-4b59-8e37-9ac5c96c3304-config\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778392 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778411 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778426 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778442 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-audit\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778458 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-etcd-serving-ca\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778472 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-images\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778489 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778506 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778526 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7dc5\" (UniqueName: \"kubernetes.io/projected/37dd19aa-104d-4c79-859c-7161a185ad1c-kube-api-access-r7dc5\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778544 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-audit-policies\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778559 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37dd19aa-104d-4c79-859c-7161a185ad1c-etcd-client\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778574 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-client-ca\") pod 
\"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778592 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6587adc-a984-4ce3-af8d-6739325c8604-metrics-certs\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778607 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-audit-policies\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778623 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-serving-cert\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778640 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/929acffa-90b0-4dfc-a65b-a8758c000f41-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778657 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-serving-cert\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778672 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37dd19aa-104d-4c79-859c-7161a185ad1c-node-pullsecrets\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778688 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6jc8\" (UniqueName: \"kubernetes.io/projected/7a677441-8b2d-41ae-8dd8-e3334c16c700-kube-api-access-f6jc8\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778709 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1dfc875-304c-4f81-8d12-c5463743ad08-trusted-ca\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778724 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778743 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778758 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da9aea11-a56e-498b-8678-590b288b372f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5652b\" (UID: \"da9aea11-a56e-498b-8678-590b288b372f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778787 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e494e6a-0b34-4706-8284-5dc5086c89b3-metrics-tls\") pod \"dns-operator-744455d44c-6q5zd\" (UID: \"4e494e6a-0b34-4706-8284-5dc5086c89b3\") " pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778804 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-audit-dir\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778821 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-config\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778838 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778856 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37dd19aa-104d-4c79-859c-7161a185ad1c-serving-cert\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778873 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7qgl\" (UniqueName: \"kubernetes.io/projected/929acffa-90b0-4dfc-a65b-a8758c000f41-kube-api-access-j7qgl\") pod 
\"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778903 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf24a16c-4573-4014-8057-b4da43a0b145-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lh444\" (UID: \"bf24a16c-4573-4014-8057-b4da43a0b145\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778926 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46e3679a-b63e-4f7c-b118-02287f570a24-audit-dir\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778941 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778959 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.778998 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-service-ca-bundle\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.779015 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wnm8\" (UniqueName: \"kubernetes.io/projected/10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b-kube-api-access-6wnm8\") pod \"multus-admission-controller-857f4d67dd-srgxj\" (UID: \"10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.779031 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f468aa01-3497-4b9b-bf5b-33aaf845e8cd-config\") pod \"kube-controller-manager-operator-78b949d7b-m9vv7\" (UID: \"f468aa01-3497-4b9b-bf5b-33aaf845e8cd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.779036 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/da9aea11-a56e-498b-8678-590b288b372f-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-5652b\" (UID: 
\"da9aea11-a56e-498b-8678-590b288b372f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.779048 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/929acffa-90b0-4dfc-a65b-a8758c000f41-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.779084 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.779283 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/37dd19aa-104d-4c79-859c-7161a185ad1c-audit-dir\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.779451 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-config\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.780217 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-config\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.780812 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-service-ca-bundle\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.780943 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.782273 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 
06:21:36.782710 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.782813 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.782849 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a677441-8b2d-41ae-8dd8-e3334c16c700-serving-cert\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.783285 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.783322 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/37dd19aa-104d-4c79-859c-7161a185ad1c-etcd-client\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.783418 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-audit-policies\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.783630 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d6587adc-a984-4ce3-af8d-6739325c8604-default-certificate\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.783788 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-encryption-config\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.783846 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-config\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.784051 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.784216 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.784742 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-config\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.785494 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/929acffa-90b0-4dfc-a65b-a8758c000f41-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.785695 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-etcd-client\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.785818 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/da9aea11-a56e-498b-8678-590b288b372f-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-5652b\" (UID: \"da9aea11-a56e-498b-8678-590b288b372f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.785900 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535f395c-e127-4a48-8766-707bf9d4d5a3-serving-cert\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.786152 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.786328 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-audit-policies\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.786403 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d6587adc-a984-4ce3-af8d-6739325c8604-stats-auth\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.787063 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/37dd19aa-104d-4c79-859c-7161a185ad1c-node-pullsecrets\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.787342 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.787712 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-audit\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.787808 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-image-import-ca\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.787877 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.788073 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-config\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.788280 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/37dd19aa-104d-4c79-859c-7161a185ad1c-encryption-config\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.788331 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-audit-dir\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.788507 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/50e623b3-90a5-4b59-8e37-9ac5c96c3304-machine-approver-tls\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.788719 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e623b3-90a5-4b59-8e37-9ac5c96c3304-config\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.788724 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.788782 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-serving-cert\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.788976 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-client-ca\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.789048 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-images\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.789085 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46e3679a-b63e-4f7c-b118-02287f570a24-audit-dir\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.789419 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/37dd19aa-104d-4c79-859c-7161a185ad1c-etcd-serving-ca\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 
06:21:36.789597 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-client-ca\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.789693 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/929acffa-90b0-4dfc-a65b-a8758c000f41-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.790343 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-serving-cert\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.790553 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.790868 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d6587adc-a984-4ce3-af8d-6739325c8604-metrics-certs\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.791388 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.791599 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.794030 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/50e623b3-90a5-4b59-8e37-9ac5c96c3304-auth-proxy-config\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.795306 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37dd19aa-104d-4c79-859c-7161a185ad1c-serving-cert\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.797247 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.802663 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.804070 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.823455 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.827779 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d6587adc-a984-4ce3-af8d-6739325c8604-service-ca-bundle\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.843346 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.874802 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.882748 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f1dfc875-304c-4f81-8d12-c5463743ad08-trusted-ca\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.884266 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.903658 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.923791 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.932197 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f1dfc875-304c-4f81-8d12-c5463743ad08-metrics-tls\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.965147 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Sep 30 06:21:36 crc kubenswrapper[4691]: I0930 06:21:36.983379 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.003822 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.023264 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.043559 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.063637 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.070247 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f468aa01-3497-4b9b-bf5b-33aaf845e8cd-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-m9vv7\" (UID: \"f468aa01-3497-4b9b-bf5b-33aaf845e8cd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.084206 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.103565 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.112347 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f468aa01-3497-4b9b-bf5b-33aaf845e8cd-config\") pod \"kube-controller-manager-operator-78b949d7b-m9vv7\" (UID: \"f468aa01-3497-4b9b-bf5b-33aaf845e8cd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.124363 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.144752 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.159196 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e494e6a-0b34-4706-8284-5dc5086c89b3-metrics-tls\") pod \"dns-operator-744455d44c-6q5zd\" (UID: \"4e494e6a-0b34-4706-8284-5dc5086c89b3\") " pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.164027 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.184788 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.204350 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.224439 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.244854 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.264642 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.284345 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.304195 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.325370 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.344865 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.364544 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.384366 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.404471 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.425192 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.434352 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf24a16c-4573-4014-8057-b4da43a0b145-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lh444\" (UID: \"bf24a16c-4573-4014-8057-b4da43a0b145\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.444298 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.449322 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf24a16c-4573-4014-8057-b4da43a0b145-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lh444\" (UID: \"bf24a16c-4573-4014-8057-b4da43a0b145\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.465067 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.475325 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-srgxj\" (UID: \"10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.484622 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.505101 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.524563 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.545280 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.584066 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.605469 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.624567 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.644991 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.665103 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.683179 4691 request.go:700] Waited for 1.008517982s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/secrets?fieldSelector=metadata.name%3Dmarketplace-operator-metrics&limit=500&resourceVersion=0
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.685600 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.714343 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.724401 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.744834 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.765522 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.784953 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.804522 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.823616 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.844357 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.864370 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.884009 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.904848 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.924869 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.944288 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.964133 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Sep 30 06:21:37 crc kubenswrapper[4691]: I0930 06:21:37.985063 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.005202 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.024410 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.046016 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.064019 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.085246 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.104850 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.124982 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.144778 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.163511 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.184389 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.204625 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.224808 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.244244 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.264363 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.284682 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.304245 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.324804 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.345552 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.364429 4691 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.384009 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.404760 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.425147 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.444823 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.465496 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.485077 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.510359 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.525567 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.545782 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.602156 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f1dfc875-304c-4f81-8d12-c5463743ad08-bound-sa-token\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.604245 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xljs\" (UniqueName: \"kubernetes.io/projected/e35ebb0f-1009-48b7-b981-c3c46bc7fce6-kube-api-access-2xljs\") pod \"authentication-operator-69f744f599-7bzfz\" (UID: \"e35ebb0f-1009-48b7-b981-c3c46bc7fce6\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.621998 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cctm5\" (UniqueName: \"kubernetes.io/projected/50e623b3-90a5-4b59-8e37-9ac5c96c3304-kube-api-access-cctm5\") pod \"machine-approver-56656f9798-hf5vd\" (UID: \"50e623b3-90a5-4b59-8e37-9ac5c96c3304\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.625057 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.650555 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f468aa01-3497-4b9b-bf5b-33aaf845e8cd-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-m9vv7\" (UID: \"f468aa01-3497-4b9b-bf5b-33aaf845e8cd\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.666744 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jx82g\" (UniqueName: \"kubernetes.io/projected/f1dfc875-304c-4f81-8d12-c5463743ad08-kube-api-access-jx82g\") pod \"ingress-operator-5b745b69d9-62hxb\" (UID: \"f1dfc875-304c-4f81-8d12-c5463743ad08\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.690353 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-974f2\" (UniqueName: \"kubernetes.io/projected/162d74bd-6a30-4fa0-88b7-2aa59426c6c8-kube-api-access-974f2\") pod \"migrator-59844c95c7-6gt99\" (UID: \"162d74bd-6a30-4fa0-88b7-2aa59426c6c8\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.702654 4691 request.go:700] Waited for 1.92157487s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/serviceaccounts/cluster-image-registry-operator/token
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.708931 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wnm8\" (UniqueName: \"kubernetes.io/projected/10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b-kube-api-access-6wnm8\") pod \"multus-admission-controller-857f4d67dd-srgxj\" (UID: \"10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.722819 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/929acffa-90b0-4dfc-a65b-a8758c000f41-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.744703 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6l96\" (UniqueName: \"kubernetes.io/projected/b45e6c14-bbb6-4a9c-92e0-10c09fe37093-kube-api-access-z6l96\") pod \"apiserver-7bbb656c7d-fthnz\" (UID: \"b45e6c14-bbb6-4a9c-92e0-10c09fe37093\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.748741 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.773369 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7dc5\" (UniqueName: \"kubernetes.io/projected/37dd19aa-104d-4c79-859c-7161a185ad1c-kube-api-access-r7dc5\") pod \"apiserver-76f77b778f-5hlv9\" (UID: \"37dd19aa-104d-4c79-859c-7161a185ad1c\") " pod="openshift-apiserver/apiserver-76f77b778f-5hlv9"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.779238 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2kjp\" (UniqueName: \"kubernetes.io/projected/da9aea11-a56e-498b-8678-590b288b372f-kube-api-access-q2kjp\") pod \"openshift-controller-manager-operator-756b6f6bc6-5652b\" (UID: \"da9aea11-a56e-498b-8678-590b288b372f\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.798686 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.803735 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6vcq\" (UniqueName: \"kubernetes.io/projected/535f395c-e127-4a48-8766-707bf9d4d5a3-kube-api-access-p6vcq\") pod \"route-controller-manager-6576b87f9c-fqxlz\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.828312 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.828795 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bf24a16c-4573-4014-8057-b4da43a0b145-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-lh444\" (UID: \"bf24a16c-4573-4014-8057-b4da43a0b145\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.843720 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssd8b\" (UniqueName: \"kubernetes.io/projected/46e3679a-b63e-4f7c-b118-02287f570a24-kube-api-access-ssd8b\") pod \"oauth-openshift-558db77b4-r8bfj\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.857942 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.859444 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hvpt\" (UniqueName: \"kubernetes.io/projected/d6587adc-a984-4ce3-af8d-6739325c8604-kube-api-access-7hvpt\") pod \"router-default-5444994796-7frz7\" (UID: \"d6587adc-a984-4ce3-af8d-6739325c8604\") " pod="openshift-ingress/router-default-5444994796-7frz7"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.865652 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7frz7"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.871936 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.882048 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fck42\" (UniqueName: \"kubernetes.io/projected/ee1c2dd6-d759-4d3c-9ec7-86ec11419202-kube-api-access-fck42\") pod \"machine-api-operator-5694c8668f-kzftt\" (UID: \"ee1c2dd6-d759-4d3c-9ec7-86ec11419202\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.906334 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vmf4\" (UniqueName: \"kubernetes.io/projected/4e494e6a-0b34-4706-8284-5dc5086c89b3-kube-api-access-4vmf4\") pod \"dns-operator-744455d44c-6q5zd\" (UID: \"4e494e6a-0b34-4706-8284-5dc5086c89b3\") " pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd"
Sep 30 06:21:38 crc kubenswrapper[4691]: W0930 06:21:38.909605 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6587adc_a984_4ce3_af8d_6739325c8604.slice/crio-f73708f086857fb9ee57e2b50dcd9f5625335d5503fc679a9e747019acba5423 WatchSource:0}: Error finding container f73708f086857fb9ee57e2b50dcd9f5625335d5503fc679a9e747019acba5423: Status 404 returned error can't find the container with id f73708f086857fb9ee57e2b50dcd9f5625335d5503fc679a9e747019acba5423
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.917965 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.924091 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7qgl\" (UniqueName: \"kubernetes.io/projected/929acffa-90b0-4dfc-a65b-a8758c000f41-kube-api-access-j7qgl\") pod \"cluster-image-registry-operator-dc59b4c8b-kp6rk\" (UID: \"929acffa-90b0-4dfc-a65b-a8758c000f41\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.924914 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.945594 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6jc8\" (UniqueName: \"kubernetes.io/projected/7a677441-8b2d-41ae-8dd8-e3334c16c700-kube-api-access-f6jc8\") pod \"controller-manager-879f6c89f-4bdqt\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.947095 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.956623 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.956762 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.961290 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.980194 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.991947 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" event={"ID":"50e623b3-90a5-4b59-8e37-9ac5c96c3304","Type":"ContainerStarted","Data":"6f023108f8798a7808968558de24a2dfa7367a56c8c33e3cda18b618c6a141e7"}
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.994626 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt"
Sep 30 06:21:38 crc kubenswrapper[4691]: I0930 06:21:38.996396 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7frz7" event={"ID":"d6587adc-a984-4ce3-af8d-6739325c8604","Type":"ContainerStarted","Data":"f73708f086857fb9ee57e2b50dcd9f5625335d5503fc679a9e747019acba5423"}
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016684 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-config\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016756 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef565c10-206c-406c-8b36-5b3336fa1934-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lww4z\" (UID: \"ef565c10-206c-406c-8b36-5b3336fa1934\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016790 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c5e7cf84-31e8-4a66-971f-18cba2113669-etcd-client\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016818 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-tls\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016832 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tfqp\" (UniqueName: \"kubernetes.io/projected/19dcb2c9-5608-463f-96aa-37fb332fcd57-kube-api-access-9tfqp\") pod \"openshift-apiserver-operator-796bbdcf4f-jwl8z\" (UID: \"19dcb2c9-5608-463f-96aa-37fb332fcd57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016847 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-certificates\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016861 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzwbn\" (UniqueName: \"kubernetes.io/projected/c5e7cf84-31e8-4a66-971f-18cba2113669-kube-api-access-wzwbn\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016875 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/063ba2d8-4d97-474f-8c2d-0541f6f5cec5-serving-cert\") pod \"openshift-config-operator-7777fb866f-bshxb\" (UID: \"063ba2d8-4d97-474f-8c2d-0541f6f5cec5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016908 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cd27b2f-3e77-4f36-b646-60e833384949-trusted-ca\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016924 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c134a0fe-e3a2-4683-95d1-045ba2056b14-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016940 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd27b2f-3e77-4f36-b646-60e833384949-config\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016957 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5ch9\" (UniqueName: \"kubernetes.io/projected/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-kube-api-access-g5ch9\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016971 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/063ba2d8-4d97-474f-8c2d-0541f6f5cec5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bshxb\" (UID: \"063ba2d8-4d97-474f-8c2d-0541f6f5cec5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.016998 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c134a0fe-e3a2-4683-95d1-045ba2056b14-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017014 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19dcb2c9-5608-463f-96aa-37fb332fcd57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jwl8z\" (UID: \"19dcb2c9-5608-463f-96aa-37fb332fcd57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017032 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kclg\" (UniqueName: \"kubernetes.io/projected/ef565c10-206c-406c-8b36-5b3336fa1934-kube-api-access-7kclg\") pod \"kube-storage-version-migrator-operator-b67b599dd-lww4z\" (UID: \"ef565c10-206c-406c-8b36-5b3336fa1934\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017056 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5e7cf84-31e8-4a66-971f-18cba2113669-serving-cert\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017071 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e7cf84-31e8-4a66-971f-18cba2113669-config\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017088 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-trusted-ca\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017108 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfp66\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-kube-api-access-bfp66\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017182 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cd27b2f-3e77-4f36-b646-60e833384949-serving-cert\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017207 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbfnn\" (UniqueName: \"kubernetes.io/projected/3cd27b2f-3e77-4f36-b646-60e833384949-kube-api-access-nbfnn\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017223 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-bound-sa-token\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017247 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c5e7cf84-31e8-4a66-971f-18cba2113669-etcd-ca\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017261 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5e7cf84-31e8-4a66-971f-18cba2113669-etcd-service-ca\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017278 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx8zh\" (UniqueName: \"kubernetes.io/projected/ffd77185-f5b4-418f-b675-db92dbe4c19f-kube-api-access-xx8zh\") pod \"cluster-samples-operator-665b6dd947-4s6nm\" (UID: \"ffd77185-f5b4-418f-b675-db92dbe4c19f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017305 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef565c10-206c-406c-8b36-5b3336fa1934-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lww4z\" (UID: \"ef565c10-206c-406c-8b36-5b3336fa1934\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017320 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19dcb2c9-5608-463f-96aa-37fb332fcd57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jwl8z\" (UID: \"19dcb2c9-5608-463f-96aa-37fb332fcd57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017347 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9wml\" (UniqueName: \"kubernetes.io/projected/a779ae38-da0c-4953-8d61-6047076785d2-kube-api-access-q9wml\") pod \"downloads-7954f5f757-nklgj\" (UID: \"a779ae38-da0c-4953-8d61-6047076785d2\") " pod="openshift-console/downloads-7954f5f757-nklgj"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017363 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffd77185-f5b4-418f-b675-db92dbe4c19f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4s6nm\" (UID: \"ffd77185-f5b4-418f-b675-db92dbe4c19f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017386 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017401 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-oauth-serving-cert\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017414 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-service-ca\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017431 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-serving-cert\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017445 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-trusted-ca-bundle\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017461 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dtg9\" (UniqueName: \"kubernetes.io/projected/063ba2d8-4d97-474f-8c2d-0541f6f5cec5-kube-api-access-6dtg9\") pod \"openshift-config-operator-7777fb866f-bshxb\" (UID: \"063ba2d8-4d97-474f-8c2d-0541f6f5cec5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.017483 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-oauth-config\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.018535 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:39.518523711 +0000 UTC m=+142.993544751 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.021427 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.070192 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.099735 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.118516 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.118646 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffd77185-f5b4-418f-b675-db92dbe4c19f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4s6nm\" (UID: \"ffd77185-f5b4-418f-b675-db92dbe4c19f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm"
Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.118743 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:39.618685932 +0000 UTC m=+143.093706972 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.118827 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-oauth-serving-cert\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.118928 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.118994 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-service-ca\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119039 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-serving-cert\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119064 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-trusted-ca-bundle\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119080 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dtg9\" (UniqueName: \"kubernetes.io/projected/063ba2d8-4d97-474f-8c2d-0541f6f5cec5-kube-api-access-6dtg9\") pod \"openshift-config-operator-7777fb866f-bshxb\" (UID: \"063ba2d8-4d97-474f-8c2d-0541f6f5cec5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119100 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64faa7eb-4089-40e6-b084-6924f274ac51-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119117 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-oauth-config\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119175 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8vv6\" (UniqueName: \"kubernetes.io/projected/16791f1d-bdee-46a9-9155-c1af947d96ec-kube-api-access-f8vv6\") pod \"catalog-operator-68c6474976-65pf2\" (UID: \"16791f1d-bdee-46a9-9155-c1af947d96ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119209 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2jn4\" (UniqueName: \"kubernetes.io/projected/66f39462-9632-40fb-abfa-5c13b3365d59-kube-api-access-x2jn4\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119236 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-tmpfs\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119280 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/16791f1d-bdee-46a9-9155-c1af947d96ec-srv-cert\") pod \"catalog-operator-68c6474976-65pf2\" (UID: \"16791f1d-bdee-46a9-9155-c1af947d96ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119297 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/16791f1d-bdee-46a9-9155-c1af947d96ec-profile-collector-cert\") pod \"catalog-operator-68c6474976-65pf2\" (UID: \"16791f1d-bdee-46a9-9155-c1af947d96ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119313 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57168c43-1c61-42dd-863a-d524aa606da0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2hd2f\" (UID: \"57168c43-1c61-42dd-863a-d524aa606da0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119339 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84873517-271a-49cb-8a39-5f28ddb68148-proxy-tls\") pod \"machine-config-controller-84d6567774-cc4c5\" (UID: \"84873517-271a-49cb-8a39-5f28ddb68148\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119356 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-plugins-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119394 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-config\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119412 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef565c10-206c-406c-8b36-5b3336fa1934-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lww4z\" (UID: \"ef565c10-206c-406c-8b36-5b3336fa1934\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119429 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4d1f634-c758-400a-8dba-baefd045834d-config-volume\") pod \"dns-default-stxbl\" (UID: \"c4d1f634-c758-400a-8dba-baefd045834d\") " pod="openshift-dns/dns-default-stxbl"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119443 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/09bc6f61-b7ce-46ff-bae4-04ff6609b246-signing-key\") pod \"service-ca-9c57cc56f-k6vnd\" (UID: \"09bc6f61-b7ce-46ff-bae4-04ff6609b246\") " pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119458 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-config-volume\") pod \"collect-profiles-29320215-9kldt\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119487 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-registration-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119502 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-secret-volume\") pod \"collect-profiles-29320215-9kldt\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119516 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/09bc6f61-b7ce-46ff-bae4-04ff6609b246-signing-cabundle\") pod \"service-ca-9c57cc56f-k6vnd\" (UID: \"09bc6f61-b7ce-46ff-bae4-04ff6609b246\") " pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119557 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-apiservice-cert\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119576 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c5e7cf84-31e8-4a66-971f-18cba2113669-etcd-client\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119593 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/05d014c8-5910-4be2-aafe-731263c73c0f-node-bootstrap-token\") pod \"machine-config-server-5wzgc\" (UID: \"05d014c8-5910-4be2-aafe-731263c73c0f\") " pod="openshift-machine-config-operator/machine-config-server-5wzgc"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119608 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbfjw\" (UniqueName: \"kubernetes.io/projected/c359f4fc-5f36-4b96-8524-f885abe54d26-kube-api-access-sbfjw\") pod \"service-ca-operator-777779d784-htgw7\" (UID: \"c359f4fc-5f36-4b96-8524-f885abe54d26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119623 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-mountpoint-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119637 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/05d014c8-5910-4be2-aafe-731263c73c0f-certs\") pod \"machine-config-server-5wzgc\" (UID: \"05d014c8-5910-4be2-aafe-731263c73c0f\") " pod="openshift-machine-config-operator/machine-config-server-5wzgc"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119652 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-676xh\" (UniqueName: \"kubernetes.io/projected/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-kube-api-access-676xh\") pod \"marketplace-operator-79b997595-dzf49\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dzf49"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119669 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-tls\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119685 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tfqp\" (UniqueName: \"kubernetes.io/projected/19dcb2c9-5608-463f-96aa-37fb332fcd57-kube-api-access-9tfqp\") pod \"openshift-apiserver-operator-796bbdcf4f-jwl8z\" (UID: \"19dcb2c9-5608-463f-96aa-37fb332fcd57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119701 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/84873517-271a-49cb-8a39-5f28ddb68148-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cc4c5\" (UID: \"84873517-271a-49cb-8a39-5f28ddb68148\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119717 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt4r8\" (UniqueName: \"kubernetes.io/projected/84873517-271a-49cb-8a39-5f28ddb68148-kube-api-access-gt4r8\") pod \"machine-config-controller-84d6567774-cc4c5\" (UID: \"84873517-271a-49cb-8a39-5f28ddb68148\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119739 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57168c43-1c61-42dd-863a-d524aa606da0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2hd2f\" (UID: \"57168c43-1c61-42dd-863a-d524aa606da0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f"
Sep 30 crc
kubenswrapper[4691]: I0930 06:21:39.119765 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6khgs\" (UniqueName: \"kubernetes.io/projected/f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b-kube-api-access-6khgs\") pod \"ingress-canary-gw6s6\" (UID: \"f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b\") " pod="openshift-ingress-canary/ingress-canary-gw6s6" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119781 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-certificates\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119796 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzwbn\" (UniqueName: \"kubernetes.io/projected/c5e7cf84-31e8-4a66-971f-18cba2113669-kube-api-access-wzwbn\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119821 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/063ba2d8-4d97-474f-8c2d-0541f6f5cec5-serving-cert\") pod \"openshift-config-operator-7777fb866f-bshxb\" (UID: \"063ba2d8-4d97-474f-8c2d-0541f6f5cec5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119837 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwh9r\" (UniqueName: \"kubernetes.io/projected/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-kube-api-access-cwh9r\") pod \"collect-profiles-29320215-9kldt\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119852 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57168c43-1c61-42dd-863a-d524aa606da0-config\") pod \"kube-apiserver-operator-766d6c64bb-2hd2f\" (UID: \"57168c43-1c61-42dd-863a-d524aa606da0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119902 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cd27b2f-3e77-4f36-b646-60e833384949-trusted-ca\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119920 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dzf49\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119944 4691 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c134a0fe-e3a2-4683-95d1-045ba2056b14-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119968 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5ch9\" (UniqueName: \"kubernetes.io/projected/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-kube-api-access-g5ch9\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119983 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/063ba2d8-4d97-474f-8c2d-0541f6f5cec5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bshxb\" (UID: \"063ba2d8-4d97-474f-8c2d-0541f6f5cec5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.119998 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd27b2f-3e77-4f36-b646-60e833384949-config\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120021 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c359f4fc-5f36-4b96-8524-f885abe54d26-config\") pod \"service-ca-operator-777779d784-htgw7\" (UID: \"c359f4fc-5f36-4b96-8524-f885abe54d26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120036 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64faa7eb-4089-40e6-b084-6924f274ac51-proxy-tls\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120060 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4d1f634-c758-400a-8dba-baefd045834d-metrics-tls\") pod \"dns-default-stxbl\" (UID: \"c4d1f634-c758-400a-8dba-baefd045834d\") " pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120095 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c134a0fe-e3a2-4683-95d1-045ba2056b14-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120113 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2997210f-48b1-46e1-bf0f-12ed24852c8b-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-kr8lp\" (UID: \"2997210f-48b1-46e1-bf0f-12ed24852c8b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120129 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-socket-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120172 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19dcb2c9-5608-463f-96aa-37fb332fcd57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jwl8z\" (UID: \"19dcb2c9-5608-463f-96aa-37fb332fcd57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120187 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dzf49\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120220 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-service-ca\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120223 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kclg\" (UniqueName: \"kubernetes.io/projected/ef565c10-206c-406c-8b36-5b3336fa1934-kube-api-access-7kclg\") pod \"kube-storage-version-migrator-operator-b67b599dd-lww4z\" (UID: \"ef565c10-206c-406c-8b36-5b3336fa1934\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120300 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab09b6ea-c08c-4b80-a30a-09392178e10c-srv-cert\") pod \"olm-operator-6b444d44fb-nmhdf\" (UID: \"ab09b6ea-c08c-4b80-a30a-09392178e10c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120336 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5e7cf84-31e8-4a66-971f-18cba2113669-serving-cert\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120386 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e7cf84-31e8-4a66-971f-18cba2113669-config\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 
30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120464 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c359f4fc-5f36-4b96-8524-f885abe54d26-serving-cert\") pod \"service-ca-operator-777779d784-htgw7\" (UID: \"c359f4fc-5f36-4b96-8524-f885abe54d26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120492 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-trusted-ca\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120543 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rl2fm\" (UniqueName: \"kubernetes.io/projected/500b5163-5eae-4fcc-9546-76c8f59841fa-kube-api-access-rl2fm\") pod \"package-server-manager-789f6589d5-s2xqr\" (UID: \"500b5163-5eae-4fcc-9546-76c8f59841fa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120608 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfp66\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-kube-api-access-bfp66\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120632 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cd27b2f-3e77-4f36-b646-60e833384949-serving-cert\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120655 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjs2s\" (UniqueName: \"kubernetes.io/projected/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-kube-api-access-tjs2s\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120740 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbfnn\" (UniqueName: \"kubernetes.io/projected/3cd27b2f-3e77-4f36-b646-60e833384949-kube-api-access-nbfnn\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.120791 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8j45\" (UniqueName: \"kubernetes.io/projected/05d014c8-5910-4be2-aafe-731263c73c0f-kube-api-access-n8j45\") pod \"machine-config-server-5wzgc\" (UID: \"05d014c8-5910-4be2-aafe-731263c73c0f\") " pod="openshift-machine-config-operator/machine-config-server-5wzgc" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121259 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-bound-sa-token\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121335 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx8zh\" (UniqueName: \"kubernetes.io/projected/ffd77185-f5b4-418f-b675-db92dbe4c19f-kube-api-access-xx8zh\") pod \"cluster-samples-operator-665b6dd947-4s6nm\" (UID: \"ffd77185-f5b4-418f-b675-db92dbe4c19f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121364 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c5e7cf84-31e8-4a66-971f-18cba2113669-etcd-ca\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121404 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5e7cf84-31e8-4a66-971f-18cba2113669-etcd-service-ca\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121428 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-webhook-cert\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121490 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrzfp\" (UniqueName: \"kubernetes.io/projected/ab09b6ea-c08c-4b80-a30a-09392178e10c-kube-api-access-mrzfp\") pod \"olm-operator-6b444d44fb-nmhdf\" (UID: \"ab09b6ea-c08c-4b80-a30a-09392178e10c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121517 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/500b5163-5eae-4fcc-9546-76c8f59841fa-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s2xqr\" (UID: \"500b5163-5eae-4fcc-9546-76c8f59841fa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121544 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b-cert\") pod \"ingress-canary-gw6s6\" (UID: \"f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b\") " pod="openshift-ingress-canary/ingress-canary-gw6s6" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121606 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" 
(UniqueName: \"kubernetes.io/secret/ab09b6ea-c08c-4b80-a30a-09392178e10c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nmhdf\" (UID: \"ab09b6ea-c08c-4b80-a30a-09392178e10c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121651 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/64faa7eb-4089-40e6-b084-6924f274ac51-images\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.121738 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef565c10-206c-406c-8b36-5b3336fa1934-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lww4z\" (UID: \"ef565c10-206c-406c-8b36-5b3336fa1934\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.123506 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c134a0fe-e3a2-4683-95d1-045ba2056b14-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.123980 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-oauth-serving-cert\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.124614 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-config\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.124758 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5e7cf84-31e8-4a66-971f-18cba2113669-config\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.124850 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/c5e7cf84-31e8-4a66-971f-18cba2113669-etcd-ca\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.124866 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-trusted-ca-bundle\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.125345 
4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:39.625327793 +0000 UTC m=+143.100348933 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.125574 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/c5e7cf84-31e8-4a66-971f-18cba2113669-etcd-service-ca\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.126115 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd27b2f-3e77-4f36-b646-60e833384949-config\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.126590 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19dcb2c9-5608-463f-96aa-37fb332fcd57-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jwl8z\" (UID: \"19dcb2c9-5608-463f-96aa-37fb332fcd57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.126716 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/063ba2d8-4d97-474f-8c2d-0541f6f5cec5-available-featuregates\") pod \"openshift-config-operator-7777fb866f-bshxb\" (UID: \"063ba2d8-4d97-474f-8c2d-0541f6f5cec5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.126746 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/3cd27b2f-3e77-4f36-b646-60e833384949-trusted-ca\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.126801 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kklfr\" (UniqueName: \"kubernetes.io/projected/c4d1f634-c758-400a-8dba-baefd045834d-kube-api-access-kklfr\") pod \"dns-default-stxbl\" (UID: \"c4d1f634-c758-400a-8dba-baefd045834d\") " pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.126830 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9pv2\" (UniqueName: \"kubernetes.io/projected/64faa7eb-4089-40e6-b084-6924f274ac51-kube-api-access-q9pv2\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: 
\"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.126867 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19dcb2c9-5608-463f-96aa-37fb332fcd57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jwl8z\" (UID: \"19dcb2c9-5608-463f-96aa-37fb332fcd57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.127020 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5e7cf84-31e8-4a66-971f-18cba2113669-serving-cert\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.127124 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-trusted-ca\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.128375 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cd27b2f-3e77-4f36-b646-60e833384949-serving-cert\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.128597 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef565c10-206c-406c-8b36-5b3336fa1934-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-lww4z\" (UID: \"ef565c10-206c-406c-8b36-5b3336fa1934\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.128718 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-csi-data-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.129416 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nhvm\" (UniqueName: \"kubernetes.io/projected/09bc6f61-b7ce-46ff-bae4-04ff6609b246-kube-api-access-8nhvm\") pod \"service-ca-9c57cc56f-k6vnd\" (UID: \"09bc6f61-b7ce-46ff-bae4-04ff6609b246\") " pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.129805 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9wml\" (UniqueName: \"kubernetes.io/projected/a779ae38-da0c-4953-8d61-6047076785d2-kube-api-access-q9wml\") pod \"downloads-7954f5f757-nklgj\" (UID: \"a779ae38-da0c-4953-8d61-6047076785d2\") " pod="openshift-console/downloads-7954f5f757-nklgj" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.130051 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svxh7\" (UniqueName: \"kubernetes.io/projected/2997210f-48b1-46e1-bf0f-12ed24852c8b-kube-api-access-svxh7\") pod \"control-plane-machine-set-operator-78cbb6b69f-kr8lp\" (UID: \"2997210f-48b1-46e1-bf0f-12ed24852c8b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.130711 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-tls\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.133421 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-oauth-config\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.137293 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/c5e7cf84-31e8-4a66-971f-18cba2113669-etcd-client\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.139472 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ffd77185-f5b4-418f-b675-db92dbe4c19f-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-4s6nm\" (UID: \"ffd77185-f5b4-418f-b675-db92dbe4c19f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.140835 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ef565c10-206c-406c-8b36-5b3336fa1934-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-lww4z\" (UID: \"ef565c10-206c-406c-8b36-5b3336fa1934\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.142872 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c134a0fe-e3a2-4683-95d1-045ba2056b14-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.145955 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-serving-cert\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.146963 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/063ba2d8-4d97-474f-8c2d-0541f6f5cec5-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-bshxb\" (UID: \"063ba2d8-4d97-474f-8c2d-0541f6f5cec5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.152289 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-certificates\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.163573 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19dcb2c9-5608-463f-96aa-37fb332fcd57-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jwl8z\" (UID: \"19dcb2c9-5608-463f-96aa-37fb332fcd57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.169254 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kclg\" (UniqueName: \"kubernetes.io/projected/ef565c10-206c-406c-8b36-5b3336fa1934-kube-api-access-7kclg\") pod \"kube-storage-version-migrator-operator-b67b599dd-lww4z\" (UID: \"ef565c10-206c-406c-8b36-5b3336fa1934\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.183764 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5ch9\" (UniqueName: \"kubernetes.io/projected/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-kube-api-access-g5ch9\") pod \"console-f9d7485db-thj2p\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.207378 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tfqp\" (UniqueName: \"kubernetes.io/projected/19dcb2c9-5608-463f-96aa-37fb332fcd57-kube-api-access-9tfqp\") pod \"openshift-apiserver-operator-796bbdcf4f-jwl8z\" (UID: \"19dcb2c9-5608-463f-96aa-37fb332fcd57\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.222980 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbfnn\" (UniqueName: \"kubernetes.io/projected/3cd27b2f-3e77-4f36-b646-60e833384949-kube-api-access-nbfnn\") pod \"console-operator-58897d9998-ss8nw\" (UID: \"3cd27b2f-3e77-4f36-b646-60e833384949\") " pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232333 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232366 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232501 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-registration-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232523 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-secret-volume\") pod \"collect-profiles-29320215-9kldt\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232538 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/09bc6f61-b7ce-46ff-bae4-04ff6609b246-signing-cabundle\") pod \"service-ca-9c57cc56f-k6vnd\" (UID: \"09bc6f61-b7ce-46ff-bae4-04ff6609b246\") " pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232554 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-apiservice-cert\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232570 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sbfjw\" (UniqueName: \"kubernetes.io/projected/c359f4fc-5f36-4b96-8524-f885abe54d26-kube-api-access-sbfjw\") pod \"service-ca-operator-777779d784-htgw7\" (UID: \"c359f4fc-5f36-4b96-8524-f885abe54d26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232589 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/05d014c8-5910-4be2-aafe-731263c73c0f-node-bootstrap-token\") pod \"machine-config-server-5wzgc\" (UID: \"05d014c8-5910-4be2-aafe-731263c73c0f\") " pod="openshift-machine-config-operator/machine-config-server-5wzgc" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232613 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-mountpoint-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232635 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"certs\" (UniqueName: \"kubernetes.io/secret/05d014c8-5910-4be2-aafe-731263c73c0f-certs\") pod \"machine-config-server-5wzgc\" (UID: \"05d014c8-5910-4be2-aafe-731263c73c0f\") " pod="openshift-machine-config-operator/machine-config-server-5wzgc" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232659 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-676xh\" (UniqueName: \"kubernetes.io/projected/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-kube-api-access-676xh\") pod \"marketplace-operator-79b997595-dzf49\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232686 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/84873517-271a-49cb-8a39-5f28ddb68148-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cc4c5\" (UID: \"84873517-271a-49cb-8a39-5f28ddb68148\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232703 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt4r8\" (UniqueName: \"kubernetes.io/projected/84873517-271a-49cb-8a39-5f28ddb68148-kube-api-access-gt4r8\") pod \"machine-config-controller-84d6567774-cc4c5\" (UID: \"84873517-271a-49cb-8a39-5f28ddb68148\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232717 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57168c43-1c61-42dd-863a-d524aa606da0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2hd2f\" (UID: \"57168c43-1c61-42dd-863a-d524aa606da0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232741 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6khgs\" (UniqueName: \"kubernetes.io/projected/f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b-kube-api-access-6khgs\") pod \"ingress-canary-gw6s6\" (UID: \"f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b\") " pod="openshift-ingress-canary/ingress-canary-gw6s6" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232758 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwh9r\" (UniqueName: \"kubernetes.io/projected/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-kube-api-access-cwh9r\") pod \"collect-profiles-29320215-9kldt\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232771 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57168c43-1c61-42dd-863a-d524aa606da0-config\") pod \"kube-apiserver-operator-766d6c64bb-2hd2f\" (UID: \"57168c43-1c61-42dd-863a-d524aa606da0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232794 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dzf49\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232814 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c359f4fc-5f36-4b96-8524-f885abe54d26-config\") pod \"service-ca-operator-777779d784-htgw7\" (UID: \"c359f4fc-5f36-4b96-8524-f885abe54d26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232832 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64faa7eb-4089-40e6-b084-6924f274ac51-proxy-tls\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232845 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4d1f634-c758-400a-8dba-baefd045834d-metrics-tls\") pod \"dns-default-stxbl\" (UID: \"c4d1f634-c758-400a-8dba-baefd045834d\") " pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.232864 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2997210f-48b1-46e1-bf0f-12ed24852c8b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kr8lp\" (UID: \"2997210f-48b1-46e1-bf0f-12ed24852c8b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233142 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-socket-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233172 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dzf49\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233189 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab09b6ea-c08c-4b80-a30a-09392178e10c-srv-cert\") pod \"olm-operator-6b444d44fb-nmhdf\" (UID: \"ab09b6ea-c08c-4b80-a30a-09392178e10c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233214 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c359f4fc-5f36-4b96-8524-f885abe54d26-serving-cert\") pod \"service-ca-operator-777779d784-htgw7\" (UID: \"c359f4fc-5f36-4b96-8524-f885abe54d26\") " 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233230 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rl2fm\" (UniqueName: \"kubernetes.io/projected/500b5163-5eae-4fcc-9546-76c8f59841fa-kube-api-access-rl2fm\") pod \"package-server-manager-789f6589d5-s2xqr\" (UID: \"500b5163-5eae-4fcc-9546-76c8f59841fa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233254 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjs2s\" (UniqueName: \"kubernetes.io/projected/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-kube-api-access-tjs2s\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233272 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8j45\" (UniqueName: \"kubernetes.io/projected/05d014c8-5910-4be2-aafe-731263c73c0f-kube-api-access-n8j45\") pod \"machine-config-server-5wzgc\" (UID: \"05d014c8-5910-4be2-aafe-731263c73c0f\") " pod="openshift-machine-config-operator/machine-config-server-5wzgc" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233307 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-webhook-cert\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233321 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrzfp\" (UniqueName: \"kubernetes.io/projected/ab09b6ea-c08c-4b80-a30a-09392178e10c-kube-api-access-mrzfp\") pod \"olm-operator-6b444d44fb-nmhdf\" (UID: \"ab09b6ea-c08c-4b80-a30a-09392178e10c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233327 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/09bc6f61-b7ce-46ff-bae4-04ff6609b246-signing-cabundle\") pod \"service-ca-9c57cc56f-k6vnd\" (UID: \"09bc6f61-b7ce-46ff-bae4-04ff6609b246\") " pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233335 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b-cert\") pod \"ingress-canary-gw6s6\" (UID: \"f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b\") " pod="openshift-ingress-canary/ingress-canary-gw6s6" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233361 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/500b5163-5eae-4fcc-9546-76c8f59841fa-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s2xqr\" (UID: \"500b5163-5eae-4fcc-9546-76c8f59841fa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233379 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab09b6ea-c08c-4b80-a30a-09392178e10c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nmhdf\" (UID: \"ab09b6ea-c08c-4b80-a30a-09392178e10c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233394 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/64faa7eb-4089-40e6-b084-6924f274ac51-images\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.233409 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:39.733391535 +0000 UTC m=+143.208412575 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233435 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kklfr\" (UniqueName: \"kubernetes.io/projected/c4d1f634-c758-400a-8dba-baefd045834d-kube-api-access-kklfr\") pod \"dns-default-stxbl\" (UID: \"c4d1f634-c758-400a-8dba-baefd045834d\") " pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233459 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9pv2\" (UniqueName: \"kubernetes.io/projected/64faa7eb-4089-40e6-b084-6924f274ac51-kube-api-access-q9pv2\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233485 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-csi-data-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233509 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nhvm\" (UniqueName: \"kubernetes.io/projected/09bc6f61-b7ce-46ff-bae4-04ff6609b246-kube-api-access-8nhvm\") pod \"service-ca-9c57cc56f-k6vnd\" (UID: \"09bc6f61-b7ce-46ff-bae4-04ff6609b246\") " pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233535 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svxh7\" (UniqueName: \"kubernetes.io/projected/2997210f-48b1-46e1-bf0f-12ed24852c8b-kube-api-access-svxh7\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-kr8lp\" (UID: \"2997210f-48b1-46e1-bf0f-12ed24852c8b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233570 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233599 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64faa7eb-4089-40e6-b084-6924f274ac51-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233633 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8vv6\" (UniqueName: \"kubernetes.io/projected/16791f1d-bdee-46a9-9155-c1af947d96ec-kube-api-access-f8vv6\") pod \"catalog-operator-68c6474976-65pf2\" (UID: \"16791f1d-bdee-46a9-9155-c1af947d96ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233652 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2jn4\" (UniqueName: \"kubernetes.io/projected/66f39462-9632-40fb-abfa-5c13b3365d59-kube-api-access-x2jn4\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233671 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-tmpfs\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233688 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/16791f1d-bdee-46a9-9155-c1af947d96ec-srv-cert\") pod \"catalog-operator-68c6474976-65pf2\" (UID: \"16791f1d-bdee-46a9-9155-c1af947d96ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233704 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/16791f1d-bdee-46a9-9155-c1af947d96ec-profile-collector-cert\") pod \"catalog-operator-68c6474976-65pf2\" (UID: \"16791f1d-bdee-46a9-9155-c1af947d96ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233721 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57168c43-1c61-42dd-863a-d524aa606da0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2hd2f\" (UID: \"57168c43-1c61-42dd-863a-d524aa606da0\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233739 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84873517-271a-49cb-8a39-5f28ddb68148-proxy-tls\") pod \"machine-config-controller-84d6567774-cc4c5\" (UID: \"84873517-271a-49cb-8a39-5f28ddb68148\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233755 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-plugins-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233773 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4d1f634-c758-400a-8dba-baefd045834d-config-volume\") pod \"dns-default-stxbl\" (UID: \"c4d1f634-c758-400a-8dba-baefd045834d\") " pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233790 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/09bc6f61-b7ce-46ff-bae4-04ff6609b246-signing-key\") pod \"service-ca-9c57cc56f-k6vnd\" (UID: \"09bc6f61-b7ce-46ff-bae4-04ff6609b246\") " pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233807 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-config-volume\") pod \"collect-profiles-29320215-9kldt\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.233987 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/64faa7eb-4089-40e6-b084-6924f274ac51-images\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.234418 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-config-volume\") pod \"collect-profiles-29320215-9kldt\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.234628 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-registration-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.235836 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/57168c43-1c61-42dd-863a-d524aa606da0-config\") pod 
\"kube-apiserver-operator-766d6c64bb-2hd2f\" (UID: \"57168c43-1c61-42dd-863a-d524aa606da0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.238008 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-csi-data-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.239792 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-dzf49\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.240559 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:39.740436759 +0000 UTC m=+143.215457799 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.241965 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/05d014c8-5910-4be2-aafe-731263c73c0f-certs\") pod \"machine-config-server-5wzgc\" (UID: \"05d014c8-5910-4be2-aafe-731263c73c0f\") " pod="openshift-machine-config-operator/machine-config-server-5wzgc" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.242023 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-mountpoint-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.243623 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/05d014c8-5910-4be2-aafe-731263c73c0f-node-bootstrap-token\") pod \"machine-config-server-5wzgc\" (UID: \"05d014c8-5910-4be2-aafe-731263c73c0f\") " pod="openshift-machine-config-operator/machine-config-server-5wzgc" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.243823 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/64faa7eb-4089-40e6-b084-6924f274ac51-auth-proxy-config\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.245144 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-apiservice-cert\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.245183 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-plugins-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.246278 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4d1f634-c758-400a-8dba-baefd045834d-config-volume\") pod \"dns-default-stxbl\" (UID: \"c4d1f634-c758-400a-8dba-baefd045834d\") " pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.246681 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/66f39462-9632-40fb-abfa-5c13b3365d59-socket-dir\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.247180 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/84873517-271a-49cb-8a39-5f28ddb68148-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-cc4c5\" (UID: \"84873517-271a-49cb-8a39-5f28ddb68148\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.248554 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-tmpfs\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.248970 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/57168c43-1c61-42dd-863a-d524aa606da0-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2hd2f\" (UID: \"57168c43-1c61-42dd-863a-d524aa606da0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.251678 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c359f4fc-5f36-4b96-8524-f885abe54d26-config\") pod \"service-ca-operator-777779d784-htgw7\" (UID: \"c359f4fc-5f36-4b96-8524-f885abe54d26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.258397 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-webhook-cert\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.258873 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/09bc6f61-b7ce-46ff-bae4-04ff6609b246-signing-key\") pod \"service-ca-9c57cc56f-k6vnd\" (UID: \"09bc6f61-b7ce-46ff-bae4-04ff6609b246\") " pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.258977 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/64faa7eb-4089-40e6-b084-6924f274ac51-proxy-tls\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.260209 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/16791f1d-bdee-46a9-9155-c1af947d96ec-profile-collector-cert\") pod \"catalog-operator-68c6474976-65pf2\" (UID: \"16791f1d-bdee-46a9-9155-c1af947d96ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.262343 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-bound-sa-token\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.269716 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-secret-volume\") pod \"collect-profiles-29320215-9kldt\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.273135 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b-cert\") pod \"ingress-canary-gw6s6\" (UID: \"f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b\") " pod="openshift-ingress-canary/ingress-canary-gw6s6" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.273671 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c359f4fc-5f36-4b96-8524-f885abe54d26-serving-cert\") pod \"service-ca-operator-777779d784-htgw7\" (UID: \"c359f4fc-5f36-4b96-8524-f885abe54d26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.274046 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/16791f1d-bdee-46a9-9155-c1af947d96ec-srv-cert\") pod \"catalog-operator-68c6474976-65pf2\" (UID: \"16791f1d-bdee-46a9-9155-c1af947d96ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.274539 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dtg9\" (UniqueName: \"kubernetes.io/projected/063ba2d8-4d97-474f-8c2d-0541f6f5cec5-kube-api-access-6dtg9\") pod \"openshift-config-operator-7777fb866f-bshxb\" (UID: \"063ba2d8-4d97-474f-8c2d-0541f6f5cec5\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" Sep 
30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.275961 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ab09b6ea-c08c-4b80-a30a-09392178e10c-srv-cert\") pod \"olm-operator-6b444d44fb-nmhdf\" (UID: \"ab09b6ea-c08c-4b80-a30a-09392178e10c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.276108 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/500b5163-5eae-4fcc-9546-76c8f59841fa-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-s2xqr\" (UID: \"500b5163-5eae-4fcc-9546-76c8f59841fa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.276191 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ab09b6ea-c08c-4b80-a30a-09392178e10c-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nmhdf\" (UID: \"ab09b6ea-c08c-4b80-a30a-09392178e10c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.276878 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-dzf49\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.279122 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c4d1f634-c758-400a-8dba-baefd045834d-metrics-tls\") pod \"dns-default-stxbl\" (UID: \"c4d1f634-c758-400a-8dba-baefd045834d\") " pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.289666 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/2997210f-48b1-46e1-bf0f-12ed24852c8b-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-kr8lp\" (UID: \"2997210f-48b1-46e1-bf0f-12ed24852c8b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.298997 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-7bzfz"] Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.301406 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx8zh\" (UniqueName: \"kubernetes.io/projected/ffd77185-f5b4-418f-b675-db92dbe4c19f-kube-api-access-xx8zh\") pod \"cluster-samples-operator-665b6dd947-4s6nm\" (UID: \"ffd77185-f5b4-418f-b675-db92dbe4c19f\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.303188 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/84873517-271a-49cb-8a39-5f28ddb68148-proxy-tls\") pod \"machine-config-controller-84d6567774-cc4c5\" (UID: \"84873517-271a-49cb-8a39-5f28ddb68148\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.307646 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfp66\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-kube-api-access-bfp66\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.317468 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzwbn\" (UniqueName: \"kubernetes.io/projected/c5e7cf84-31e8-4a66-971f-18cba2113669-kube-api-access-wzwbn\") pod \"etcd-operator-b45778765-9d949\" (UID: \"c5e7cf84-31e8-4a66-971f-18cba2113669\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.334669 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.334841 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:39.834815366 +0000 UTC m=+143.309836406 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.334946 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.335362 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:39.835346263 +0000 UTC m=+143.310367303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.338682 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9wml\" (UniqueName: \"kubernetes.io/projected/a779ae38-da0c-4953-8d61-6047076785d2-kube-api-access-q9wml\") pod \"downloads-7954f5f757-nklgj\" (UID: \"a779ae38-da0c-4953-8d61-6047076785d2\") " pod="openshift-console/downloads-7954f5f757-nklgj" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.365625 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b"] Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.368559 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8bfj"] Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.381334 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.381941 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/57168c43-1c61-42dd-863a-d524aa606da0-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2hd2f\" (UID: \"57168c43-1c61-42dd-863a-d524aa606da0\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.400604 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6khgs\" (UniqueName: \"kubernetes.io/projected/f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b-kube-api-access-6khgs\") pod \"ingress-canary-gw6s6\" (UID: \"f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b\") " pod="openshift-ingress-canary/ingress-canary-gw6s6" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.409409 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-nklgj" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.417854 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.425373 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gw6s6" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.427577 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwh9r\" (UniqueName: \"kubernetes.io/projected/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-kube-api-access-cwh9r\") pod \"collect-profiles-29320215-9kldt\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.436457 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.436582 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z" Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.436667 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:39.936642861 +0000 UTC m=+143.411663901 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.436901 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.437265 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:39.937258499 +0000 UTC m=+143.412279529 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.439254 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbfjw\" (UniqueName: \"kubernetes.io/projected/c359f4fc-5f36-4b96-8524-f885abe54d26-kube-api-access-sbfjw\") pod \"service-ca-operator-777779d784-htgw7\" (UID: \"c359f4fc-5f36-4b96-8524-f885abe54d26\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.441471 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.450371 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.469274 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2jn4\" (UniqueName: \"kubernetes.io/projected/66f39462-9632-40fb-abfa-5c13b3365d59-kube-api-access-x2jn4\") pod \"csi-hostpathplugin-h6qlb\" (UID: \"66f39462-9632-40fb-abfa-5c13b3365d59\") " pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.481200 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kklfr\" (UniqueName: \"kubernetes.io/projected/c4d1f634-c758-400a-8dba-baefd045834d-kube-api-access-kklfr\") pod \"dns-default-stxbl\" (UID: \"c4d1f634-c758-400a-8dba-baefd045834d\") " pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.510831 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9pv2\" (UniqueName: \"kubernetes.io/projected/64faa7eb-4089-40e6-b084-6924f274ac51-kube-api-access-q9pv2\") pod \"machine-config-operator-74547568cd-bjpn7\" (UID: \"64faa7eb-4089-40e6-b084-6924f274ac51\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.518430 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nhvm\" (UniqueName: \"kubernetes.io/projected/09bc6f61-b7ce-46ff-bae4-04ff6609b246-kube-api-access-8nhvm\") pod \"service-ca-9c57cc56f-k6vnd\" (UID: \"09bc6f61-b7ce-46ff-bae4-04ff6609b246\") " pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.530466 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444"] Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.531613 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb"] Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.543443 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.543798 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.043783053 +0000 UTC m=+143.518804083 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.544234 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.544264 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7"] Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.548679 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svxh7\" (UniqueName: \"kubernetes.io/projected/2997210f-48b1-46e1-bf0f-12ed24852c8b-kube-api-access-svxh7\") pod \"control-plane-machine-set-operator-78cbb6b69f-kr8lp\" (UID: \"2997210f-48b1-46e1-bf0f-12ed24852c8b\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.561143 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-6q5zd"] Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.575281 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8vv6\" (UniqueName: \"kubernetes.io/projected/16791f1d-bdee-46a9-9155-c1af947d96ec-kube-api-access-f8vv6\") pod \"catalog-operator-68c6474976-65pf2\" (UID: \"16791f1d-bdee-46a9-9155-c1af947d96ec\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.583305 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.585662 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt4r8\" (UniqueName: \"kubernetes.io/projected/84873517-271a-49cb-8a39-5f28ddb68148-kube-api-access-gt4r8\") pod \"machine-config-controller-84d6567774-cc4c5\" (UID: \"84873517-271a-49cb-8a39-5f28ddb68148\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.591113 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.597619 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.599504 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-676xh\" (UniqueName: \"kubernetes.io/projected/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-kube-api-access-676xh\") pod \"marketplace-operator-79b997595-dzf49\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.606951 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.618161 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.626272 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rl2fm\" (UniqueName: \"kubernetes.io/projected/500b5163-5eae-4fcc-9546-76c8f59841fa-kube-api-access-rl2fm\") pod \"package-server-manager-789f6589d5-s2xqr\" (UID: \"500b5163-5eae-4fcc-9546-76c8f59841fa\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.629103 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.644456 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.644755 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.144745299 +0000 UTC m=+143.619766339 (durationBeforeRetry 500ms). 
Sep 30 06:21:39 crc kubenswrapper[4691]: W0930 06:21:39.647606 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e494e6a_0b34_4706_8284_5dc5086c89b3.slice/crio-38d79c89fee43e5d720e0bbd3946836be7c64e2a449246c0454a40a06fae17a3 WatchSource:0}: Error finding container 38d79c89fee43e5d720e0bbd3946836be7c64e2a449246c0454a40a06fae17a3: Status 404 returned error can't find the container with id 38d79c89fee43e5d720e0bbd3946836be7c64e2a449246c0454a40a06fae17a3
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.650572 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjs2s\" (UniqueName: \"kubernetes.io/projected/ec8d16a6-daeb-4cc0-9815-35e09c34fb71-kube-api-access-tjs2s\") pod \"packageserver-d55dfcdfc-dchbw\" (UID: \"ec8d16a6-daeb-4cc0-9815-35e09c34fb71\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.657297 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.670276 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.671907 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz"]
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.677156 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.678502 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8j45\" (UniqueName: \"kubernetes.io/projected/05d014c8-5910-4be2-aafe-731263c73c0f-kube-api-access-n8j45\") pod \"machine-config-server-5wzgc\" (UID: \"05d014c8-5910-4be2-aafe-731263c73c0f\") " pod="openshift-machine-config-operator/machine-config-server-5wzgc"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.685071 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-stxbl"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.687848 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4bdqt"]
Sep 30 06:21:39 crc kubenswrapper[4691]: W0930 06:21:39.687862 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb45e6c14_bbb6_4a9c_92e0_10c09fe37093.slice/crio-7957be09c4b14d9efcf5ab9f819655d0ee7b08e17fe7dca2c133b55c78b432f9 WatchSource:0}: Error finding container 7957be09c4b14d9efcf5ab9f819655d0ee7b08e17fe7dca2c133b55c78b432f9: Status 404 returned error can't find the container with id 7957be09c4b14d9efcf5ab9f819655d0ee7b08e17fe7dca2c133b55c78b432f9
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.691445 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrzfp\" (UniqueName: \"kubernetes.io/projected/ab09b6ea-c08c-4b80-a30a-09392178e10c-kube-api-access-mrzfp\") pod \"olm-operator-6b444d44fb-nmhdf\" (UID: \"ab09b6ea-c08c-4b80-a30a-09392178e10c\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.705165 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5hlv9"]
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.705197 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-srgxj"]
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.710765 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-h6qlb"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.719912 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-5wzgc"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.734632 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99"]
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.761137 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.761310 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.261267351 +0000 UTC m=+143.736288391 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.761520 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.761863 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.26185054 +0000 UTC m=+143.736871580 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.777288 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-bshxb"]
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.810333 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-kzftt"]
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.811564 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z"]
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.816082 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-nklgj"]
Sep 30 06:21:39 crc kubenswrapper[4691]: W0930 06:21:39.826917 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10e7bec3_75fd_4ed0_bdcd_ff1ee3fd3f9b.slice/crio-716ae26d3516a58e6ce34ee6af3f53fb9b5c12a38e5ef89c9ee5ff7756781ab3 WatchSource:0}: Error finding container 716ae26d3516a58e6ce34ee6af3f53fb9b5c12a38e5ef89c9ee5ff7756781ab3: Status 404 returned error can't find the container with id 716ae26d3516a58e6ce34ee6af3f53fb9b5c12a38e5ef89c9ee5ff7756781ab3
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.832810 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz"]
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.848122 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk"]
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.863913 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.864212 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.36419576 +0000 UTC m=+143.839216790 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.866448 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z"]
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.868620 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.879582 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.925208 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf"
Sep 30 06:21:39 crc kubenswrapper[4691]: I0930 06:21:39.965375 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:39 crc kubenswrapper[4691]: E0930 06:21:39.965649 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.465630562 +0000 UTC m=+143.940651602 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.007389 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gw6s6"]
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.020398 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" event={"ID":"46e3679a-b63e-4f7c-b118-02287f570a24","Type":"ContainerStarted","Data":"570f75deff6c2681edaecad239d0e9e4e6133109760d5d09ea77a65fcf1d59b9"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.057922 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nklgj" event={"ID":"a779ae38-da0c-4953-8d61-6047076785d2","Type":"ContainerStarted","Data":"db33ce27bcd80e8cb6e652a0070d750f31e0f911ebfd8a421acdf20e02da5c66"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.060601 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444" event={"ID":"bf24a16c-4573-4014-8057-b4da43a0b145","Type":"ContainerStarted","Data":"a687a8cb12c137583137dc2952f39656fac319bf64f807d420e22a36870734e9"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.065708 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" event={"ID":"535f395c-e127-4a48-8766-707bf9d4d5a3","Type":"ContainerStarted","Data":"88b2975d929ad637c7ee9e3385f84baf2c9722d48f88262b68afa90fcc09e9dd"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.066102 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.066458 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.566443713 +0000 UTC m=+144.041464753 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.087459 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd" event={"ID":"4e494e6a-0b34-4706-8284-5dc5086c89b3","Type":"ContainerStarted","Data":"38d79c89fee43e5d720e0bbd3946836be7c64e2a449246c0454a40a06fae17a3"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.088751 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z" event={"ID":"ef565c10-206c-406c-8b36-5b3336fa1934","Type":"ContainerStarted","Data":"c7d0873f1f2e0fe7b19d6c6d301b90008c46f89bb8f5bc8b6e8b0b7c1fe2cfae"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.090391 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9d949"]
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.096101 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-thj2p"]
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.101118 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" event={"ID":"e35ebb0f-1009-48b7-b981-c3c46bc7fce6","Type":"ContainerStarted","Data":"8a864b3782d02b9241f18e33efa40f52c63b45636293b3aed8dd304ad3d9806f"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.101153 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" event={"ID":"e35ebb0f-1009-48b7-b981-c3c46bc7fce6","Type":"ContainerStarted","Data":"b2d651fee30e248cec42cbfae964194d8847c25d1e9cf1acfceccc617462e9ac"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.111552 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z" event={"ID":"19dcb2c9-5608-463f-96aa-37fb332fcd57","Type":"ContainerStarted","Data":"bde64936efe35819d77ba02431572e6d6000a5ba8c74b131eec8d07027e3388e"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.125325 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" event={"ID":"b45e6c14-bbb6-4a9c-92e0-10c09fe37093","Type":"ContainerStarted","Data":"7957be09c4b14d9efcf5ab9f819655d0ee7b08e17fe7dca2c133b55c78b432f9"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.132265 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" event={"ID":"37dd19aa-104d-4c79-859c-7161a185ad1c","Type":"ContainerStarted","Data":"d3d8edc5596d4389940210f3de5b3c5511c20f693710036b4df58b47fb2e96eb"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.134148 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7"]
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.144213 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" event={"ID":"ee1c2dd6-d759-4d3c-9ec7-86ec11419202","Type":"ContainerStarted","Data":"fdb2a4385253ef54288ec9713cb20e9b13f6bc684f08ad34d4c1f499ed3289c2"}
pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" event={"ID":"ee1c2dd6-d759-4d3c-9ec7-86ec11419202","Type":"ContainerStarted","Data":"fdb2a4385253ef54288ec9713cb20e9b13f6bc684f08ad34d4c1f499ed3289c2"} Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.158566 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" event={"ID":"7a677441-8b2d-41ae-8dd8-e3334c16c700","Type":"ContainerStarted","Data":"51f276215a44bcc6bac3fd8de14efcc278517f7c02d577bc36d4f2d90d996cf8"} Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.169478 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.169602 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.669591079 +0000 UTC m=+144.144612109 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.188978 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99" event={"ID":"162d74bd-6a30-4fa0-88b7-2aa59426c6c8","Type":"ContainerStarted","Data":"4c598fae6856d035e3c5993e8cc9325a9bd3ef2c421dda430245d2d1a02b1da4"} Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.191244 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-ss8nw"] Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.201286 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt"] Sep 30 06:21:40 crc kubenswrapper[4691]: W0930 06:21:40.213244 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5e7cf84_31e8_4a66_971f_18cba2113669.slice/crio-99e62df967b9fe735edb8bd8c104c96dd5eec6c5f70954315809c0658a3f16cd WatchSource:0}: Error finding container 99e62df967b9fe735edb8bd8c104c96dd5eec6c5f70954315809c0658a3f16cd: Status 404 returned error can't find the container with id 99e62df967b9fe735edb8bd8c104c96dd5eec6c5f70954315809c0658a3f16cd Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.218958 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj" event={"ID":"10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b","Type":"ContainerStarted","Data":"716ae26d3516a58e6ce34ee6af3f53fb9b5c12a38e5ef89c9ee5ff7756781ab3"} Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.227992 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-5444994796-7frz7" event={"ID":"d6587adc-a984-4ce3-af8d-6739325c8604","Type":"ContainerStarted","Data":"4a501fdd6468a66f4fc76c602c2a0ee402768c590567285f6cfcd98147e9587e"} Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.234083 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" event={"ID":"f1dfc875-304c-4f81-8d12-c5463743ad08","Type":"ContainerStarted","Data":"7cb6e108dda85d17a52fa1a7dddb7efcd26f21a6732bf7193cfe53d6c0cee9f2"} Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.237495 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" event={"ID":"063ba2d8-4d97-474f-8c2d-0541f6f5cec5","Type":"ContainerStarted","Data":"da6dcdeb29609ea5666bc5c6eb9ec018b8ea50a0ef126652e63043e4eea9891b"} Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.240132 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-htgw7"] Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.258440 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7" event={"ID":"f468aa01-3497-4b9b-bf5b-33aaf845e8cd","Type":"ContainerStarted","Data":"084d03fdc602e656a9435a035f11cdfa9ed5140fe28ed864d5e834bfe3a4be4a"} Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.261444 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2"] Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.268676 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm"] Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.269481 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" event={"ID":"da9aea11-a56e-498b-8678-590b288b372f","Type":"ContainerStarted","Data":"35d4f387239453c9d2f217db77410e8a42e23337918ff154b7ad1b9972a478c4"} Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.269535 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" event={"ID":"da9aea11-a56e-498b-8678-590b288b372f","Type":"ContainerStarted","Data":"ae592222d13d8c6f4a963b629fedea5560d53c326e4b2fa8cebc17514929d462"} Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.271100 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.771084513 +0000 UTC m=+144.246105553 (durationBeforeRetry 500ms). 
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.271118 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.272807 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.273355 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.773320164 +0000 UTC m=+144.248341204 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.290325 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" event={"ID":"50e623b3-90a5-4b59-8e37-9ac5c96c3304","Type":"ContainerStarted","Data":"5384557d773ae5c4f3f8b39149792e46602445f9152f55b8752953da8a9442ed"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.290356 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" event={"ID":"50e623b3-90a5-4b59-8e37-9ac5c96c3304","Type":"ContainerStarted","Data":"ec5654a5bedcf2603a774bd7ee80d82a9b819ec1c16eedde48eeee1426a9a11b"}
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.297359 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-7bzfz" podStartSLOduration=122.297342916 podStartE2EDuration="2m2.297342916s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:40.296234372 +0000 UTC m=+143.771255422" watchObservedRunningTime="2025-09-30 06:21:40.297342916 +0000 UTC m=+143.772363956"
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.375124 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.375411 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.875383095 +0000 UTC m=+144.350404135 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.375596 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.377443 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.87740766 +0000 UTC m=+144.352428790 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.478078 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.478402 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:40.978383847 +0000 UTC m=+144.453404887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.501092 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp"]
Sep 30 06:21:40 crc kubenswrapper[4691]: W0930 06:21:40.568298 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2997210f_48b1_46e1_bf0f_12ed24852c8b.slice/crio-0306bd89e54321c00dfa6360e246cd472c8ef08aabc4ac87b1aeb5f13a803a90 WatchSource:0}: Error finding container 0306bd89e54321c00dfa6360e246cd472c8ef08aabc4ac87b1aeb5f13a803a90: Status 404 returned error can't find the container with id 0306bd89e54321c00dfa6360e246cd472c8ef08aabc4ac87b1aeb5f13a803a90
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.579125 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.579446 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.079434456 +0000 UTC m=+144.554455496 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.680922 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.681297 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.18126766 +0000 UTC m=+144.656288700 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.697875 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw"]
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.724642 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf"]
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.749945 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr"]
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.781947 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.781978 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f"]
Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.782269 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.282255937 +0000 UTC m=+144.757276977 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.793935 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5"]
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.833585 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-k6vnd"]
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.836413 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-stxbl"]
Sep 30 06:21:40 crc kubenswrapper[4691]: W0930 06:21:40.837476 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab09b6ea_c08c_4b80_a30a_09392178e10c.slice/crio-e5a3adc48741a16d0bcb8b0bd9954b5f0886833e597a1797f58f83d5eb737dfd WatchSource:0}: Error finding container e5a3adc48741a16d0bcb8b0bd9954b5f0886833e597a1797f58f83d5eb737dfd: Status 404 returned error can't find the container with id e5a3adc48741a16d0bcb8b0bd9954b5f0886833e597a1797f58f83d5eb737dfd
Sep 30 06:21:40 crc kubenswrapper[4691]: W0930 06:21:40.840609 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod500b5163_5eae_4fcc_9546_76c8f59841fa.slice/crio-5b068dddd0a8b27b446721a87f9f8521866955bea493648511cf75ddb9e5d2fa WatchSource:0}: Error finding container 5b068dddd0a8b27b446721a87f9f8521866955bea493648511cf75ddb9e5d2fa: Status 404 returned error can't find the container with id 5b068dddd0a8b27b446721a87f9f8521866955bea493648511cf75ddb9e5d2fa
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.867014 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-7frz7"
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.878031 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 06:21:40 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld
Sep 30 06:21:40 crc kubenswrapper[4691]: [+]process-running ok
Sep 30 06:21:40 crc kubenswrapper[4691]: healthz check failed
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.878089 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.883344 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.883941 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.383912446 +0000 UTC m=+144.858933486 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:40 crc kubenswrapper[4691]: W0930 06:21:40.912600 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09bc6f61_b7ce_46ff_bae4_04ff6609b246.slice/crio-51dffa5a5866f304123c4727b2395acd1ea54bbb049032e14e821d0fb5be715a WatchSource:0}: Error finding container 51dffa5a5866f304123c4727b2395acd1ea54bbb049032e14e821d0fb5be715a: Status 404 returned error can't find the container with id 51dffa5a5866f304123c4727b2395acd1ea54bbb049032e14e821d0fb5be715a
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.964741 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-h6qlb"]
Sep 30 06:21:40 crc kubenswrapper[4691]: I0930 06:21:40.988989 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:40 crc kubenswrapper[4691]: E0930 06:21:40.989400 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.489388216 +0000 UTC m=+144.964409256 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.033319 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dzf49"]
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.089807 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.090033 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.590005912 +0000 UTC m=+145.065026942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.090137 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.090463 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.590450986 +0000 UTC m=+145.065472026 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.191063 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.191230 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.691206496 +0000 UTC m=+145.166227536 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.191586 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.194313 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.694298434 +0000 UTC m=+145.169319474 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.293819 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.294149 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.794133625 +0000 UTC m=+145.269154665 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.297687 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" event={"ID":"ec8d16a6-daeb-4cc0-9815-35e09c34fb71","Type":"ContainerStarted","Data":"eb27f991d56039740110105d7313aac3165adf7ef947594d04d715f44fff9545"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.311038 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5wzgc" event={"ID":"05d014c8-5910-4be2-aafe-731263c73c0f","Type":"ContainerStarted","Data":"4211a7337104f63e39076bc6c568410de9e3fb56926917f9584d710601d659e3"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.311080 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-5wzgc" event={"ID":"05d014c8-5910-4be2-aafe-731263c73c0f","Type":"ContainerStarted","Data":"fd5f783f982df06e89e36e6e6d282b446fe5cd0eda3d1dff7859eb2ecf87ba11"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.312431 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-stxbl" event={"ID":"c4d1f634-c758-400a-8dba-baefd045834d","Type":"ContainerStarted","Data":"7ac566b618f8a47da78ac0a32b0ffdb027f3467a72e2a6c4ad09c5fdb6735241"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.313301 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z" event={"ID":"ef565c10-206c-406c-8b36-5b3336fa1934","Type":"ContainerStarted","Data":"5756e0e07dc95f2da9d16454ac7f5752551c98d9431bd53a96789edf67c37a95"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.316124 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" event={"ID":"ee1c2dd6-d759-4d3c-9ec7-86ec11419202","Type":"ContainerStarted","Data":"94c7c6c0d40be106a5b0813cc94df4847f25c761cc1d4ce54c58d3974d37edfa"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.337051 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" event={"ID":"57168c43-1c61-42dd-863a-d524aa606da0","Type":"ContainerStarted","Data":"0d51419abd0504a326ac62d94af0812a24e12adfbef984d350f415a9ff10c6cf"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.339985 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" event={"ID":"7a677441-8b2d-41ae-8dd8-e3334c16c700","Type":"ContainerStarted","Data":"3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.340035 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.341466 4691 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4bdqt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body=
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.341602 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" podUID="7a677441-8b2d-41ae-8dd8-e3334c16c700" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.343662 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" event={"ID":"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8","Type":"ContainerStarted","Data":"182bdd9e95dffa9e8af6d34ae75a1122af8cdb776d23f0e45038ffab88d8eda6"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.345670 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444" event={"ID":"bf24a16c-4573-4014-8057-b4da43a0b145","Type":"ContainerStarted","Data":"49d5c45cd34c55825ffdcc668379212285396830921dec6aac1154a728d3c23c"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.367619 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" event={"ID":"84873517-271a-49cb-8a39-5f28ddb68148","Type":"ContainerStarted","Data":"33d1894db723d8a2dd45f0144f9b8ae01be673bdf7541adcfed22d815647f89e"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.378992 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" event={"ID":"535f395c-e127-4a48-8766-707bf9d4d5a3","Type":"ContainerStarted","Data":"141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.379463 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.380708 4691 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-fqxlz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body=
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.380754 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" podUID="535f395c-e127-4a48-8766-707bf9d4d5a3" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.15:8443/healthz\": dial tcp 10.217.0.15:8443: connect: connection refused"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.392507 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-nklgj" event={"ID":"a779ae38-da0c-4953-8d61-6047076785d2","Type":"ContainerStarted","Data":"0013ab9e50091f76fd205312c5ad285e2d16f82341d7478830b5d27614599783"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.393064 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-nklgj"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.396798 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.397085 4691 patch_prober.go:28] interesting pod/downloads-7954f5f757-nklgj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body=
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.397152 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nklgj" podUID="a779ae38-da0c-4953-8d61-6047076785d2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused"
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.399485 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.899469371 +0000 UTC m=+145.374490411 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
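Note: the probe records in this stretch show the two failure shapes kubelet logs while these workloads come up: readiness probes that get "connect: connection refused" because the server socket is not listening yet (controller-manager on 10.217.0.7:8443, route-controller-manager on 10.217.0.15:8443, downloads on 10.217.0.20:8080), and the router's startup probe, which reaches its endpoint but gets HTTP 500 with failing sub-checks ([-]backend-http, [-]has-synced). A rough Go sketch of such an HTTP probe (a hypothetical, simplified stand-in for kubelet's prober: transport errors and non-2xx/3xx statuses both count as failure):

    package main

    import (
        "fmt"
        "io"
        "net/http"
        "time"
    )

    // probeHTTP performs one HTTP probe. A transport error (e.g. connection
    // refused) or a status outside 200-399 is a failure, matching the
    // prober.go:107 "Probe failed" records above.
    func probeHTTP(url string) (ok bool, output string) {
        client := &http.Client{Timeout: time.Second}
        resp, err := client.Get(url)
        if err != nil {
            return false, err.Error()
        }
        defer resp.Body.Close()
        body, _ := io.ReadAll(io.LimitReader(resp.Body, 1024)) // the "start-of-body" in the log
        if resp.StatusCode >= 200 && resp.StatusCode < 400 {
            return true, string(body)
        }
        return false, fmt.Sprintf("HTTP probe failed with statuscode: %d, start-of-body=%s", resp.StatusCode, body)
    }

    func main() {
        // Endpoint taken from the downloads pod record; outside the cluster
        // this simply demonstrates the connection-refused failure path.
        ok, out := probeHTTP("http://10.217.0.20:8080/")
        fmt.Println(ok, out)
    }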
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.405421 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" event={"ID":"46e3679a-b63e-4f7c-b118-02287f570a24","Type":"ContainerStarted","Data":"883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.406575 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.410048 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" event={"ID":"09bc6f61-b7ce-46ff-bae4-04ff6609b246","Type":"ContainerStarted","Data":"51dffa5a5866f304123c4727b2395acd1ea54bbb049032e14e821d0fb5be715a"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.411833 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" event={"ID":"929acffa-90b0-4dfc-a65b-a8758c000f41","Type":"ContainerStarted","Data":"958a0f01e9810287478acc464215020f9e8c14a86e048cc4b7b054294b65ca94"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.414950 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.422445 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z" event={"ID":"19dcb2c9-5608-463f-96aa-37fb332fcd57","Type":"ContainerStarted","Data":"0cfe324d9c9adfcd3cea808543ee80ad6701936d5b75b08e0f00a7f533045c51"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.441480 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" event={"ID":"f5ef6b93-5bb5-467f-8268-5feb300e2d5c","Type":"ContainerStarted","Data":"0ecf98d90806bfece70489b8b5d7a016804ae84800a14f33622c14164ca1e62f"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.452112 4691 generic.go:334] "Generic (PLEG): container finished" podID="37dd19aa-104d-4c79-859c-7161a185ad1c" containerID="e95e0fe0d6c3fa9d17dc7476949f139cf7af067fe0c1201e1e17432d7f8d9e63" exitCode=0
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.452229 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" event={"ID":"37dd19aa-104d-4c79-859c-7161a185ad1c","Type":"ContainerDied","Data":"e95e0fe0d6c3fa9d17dc7476949f139cf7af067fe0c1201e1e17432d7f8d9e63"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.464089 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" event={"ID":"16791f1d-bdee-46a9-9155-c1af947d96ec","Type":"ContainerStarted","Data":"7dbab6f8bd569e3e6cd9aa8b2faf246dae4fd2568cf5da86382517401565ec30"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.479414 4691 generic.go:334] "Generic (PLEG): container finished" podID="b45e6c14-bbb6-4a9c-92e0-10c09fe37093" containerID="d7614be1e8c73fd4c5999445ddc43f038419b834f36b6ce3a67caf9a88092a52" exitCode=0
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.480124 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" event={"ID":"b45e6c14-bbb6-4a9c-92e0-10c09fe37093","Type":"ContainerDied","Data":"d7614be1e8c73fd4c5999445ddc43f038419b834f36b6ce3a67caf9a88092a52"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.484563 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7" event={"ID":"f468aa01-3497-4b9b-bf5b-33aaf845e8cd","Type":"ContainerStarted","Data":"7e321f1ee784d02403372d9c48baf3126d7f6d6f50ae6aa8da3803c7ddff7b28"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.489238 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-thj2p" event={"ID":"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3","Type":"ContainerStarted","Data":"5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.489274 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-thj2p" event={"ID":"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3","Type":"ContainerStarted","Data":"2288a7d9adacb360f77c41fddbca550712f08789f2d5146f0b65344bbebcb335"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.493732 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ss8nw" event={"ID":"3cd27b2f-3e77-4f36-b646-60e833384949","Type":"ContainerStarted","Data":"2ae98def1b82506628e84d8ecdefd078f678058ee9669e84352a5a6cc1ed54fb"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.496577 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" event={"ID":"66f39462-9632-40fb-abfa-5c13b3365d59","Type":"ContainerStarted","Data":"630cabdc17bcfd97af9632ca3f3213515fbb866b482bcf4c71648167649e8e73"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.497545 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.498515 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:41.998482805 +0000 UTC m=+145.473503845 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.504431 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" event={"ID":"ab09b6ea-c08c-4b80-a30a-09392178e10c","Type":"ContainerStarted","Data":"e5a3adc48741a16d0bcb8b0bd9954b5f0886833e597a1797f58f83d5eb737dfd"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.508797 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" event={"ID":"c359f4fc-5f36-4b96-8524-f885abe54d26","Type":"ContainerStarted","Data":"f18e00ba4031654e247d54926b5863f729be8b55b5fea69dd089ea381006f795"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.513836 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd" event={"ID":"4e494e6a-0b34-4706-8284-5dc5086c89b3","Type":"ContainerStarted","Data":"6416a043dad9fb09c2246371519a27056deb761d322ce4cb688a717d0821ac98"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.532105 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" event={"ID":"c5e7cf84-31e8-4a66-971f-18cba2113669","Type":"ContainerStarted","Data":"99e62df967b9fe735edb8bd8c104c96dd5eec6c5f70954315809c0658a3f16cd"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.534014 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" event={"ID":"f1dfc875-304c-4f81-8d12-c5463743ad08","Type":"ContainerStarted","Data":"071e9365d9b76fa94e770e7850ee0b9d878699d86be0566cac82afd73e2e1f93"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.535746 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" event={"ID":"2997210f-48b1-46e1-bf0f-12ed24852c8b","Type":"ContainerStarted","Data":"0306bd89e54321c00dfa6360e246cd472c8ef08aabc4ac87b1aeb5f13a803a90"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.536935 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" event={"ID":"500b5163-5eae-4fcc-9546-76c8f59841fa","Type":"ContainerStarted","Data":"5b068dddd0a8b27b446721a87f9f8521866955bea493648511cf75ddb9e5d2fa"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.541950 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99" event={"ID":"162d74bd-6a30-4fa0-88b7-2aa59426c6c8","Type":"ContainerStarted","Data":"4f03e1c4e6c5e455e5b08e3be0e92af14ab725873827ceeebae82c37cc4cca46"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.553222 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gw6s6" event={"ID":"f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b","Type":"ContainerStarted","Data":"377c1851f9a6a0ab2e4c106675e42dfde5ec682552cbd7809775da47b77917bf"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.564491 4691 generic.go:334] "Generic (PLEG): container finished" podID="063ba2d8-4d97-474f-8c2d-0541f6f5cec5" containerID="b159c6bd4db6c6520850cad2e402314fee7863139bdf67b7c6cd5f386ff4027a" exitCode=0
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.564559 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" event={"ID":"063ba2d8-4d97-474f-8c2d-0541f6f5cec5","Type":"ContainerDied","Data":"b159c6bd4db6c6520850cad2e402314fee7863139bdf67b7c6cd5f386ff4027a"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.567084 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" podStartSLOduration=123.567073574 podStartE2EDuration="2m3.567073574s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:41.566780244 +0000 UTC m=+145.041801284" watchObservedRunningTime="2025-09-30 06:21:41.567073574 +0000 UTC m=+145.042094614"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.582267 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm" event={"ID":"ffd77185-f5b4-418f-b675-db92dbe4c19f","Type":"ContainerStarted","Data":"1a45d5f723d78788dedd80d6801ed419bd344fe17472608be9200b2595b4b3de"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.585617 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" event={"ID":"64faa7eb-4089-40e6-b084-6924f274ac51","Type":"ContainerStarted","Data":"deebe1ea5c0f50e354f7c9c382f537e5faa2f4f603da5adb00f2d35924d011cf"}
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.595064 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-5wzgc" podStartSLOduration=5.595027192 podStartE2EDuration="5.595027192s" podCreationTimestamp="2025-09-30 06:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:41.59436708 +0000 UTC m=+145.069388120" watchObservedRunningTime="2025-09-30 06:21:41.595027192 +0000 UTC m=+145.070048252"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.603093 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.603968 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.103948925 +0000 UTC m=+145.578969965 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.635294 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" podStartSLOduration=123.635278719 podStartE2EDuration="2m3.635278719s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:41.633980749 +0000 UTC m=+145.109001809" watchObservedRunningTime="2025-09-30 06:21:41.635278719 +0000 UTC m=+145.110299759"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.689259 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-7frz7" podStartSLOduration=123.689246754 podStartE2EDuration="2m3.689246754s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:41.688486759 +0000 UTC m=+145.163507809" watchObservedRunningTime="2025-09-30 06:21:41.689246754 +0000 UTC m=+145.164267794"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.704407 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.705190 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.205173159 +0000 UTC m=+145.680194199 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.713968 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-hf5vd" podStartSLOduration=123.713947788 podStartE2EDuration="2m3.713947788s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:41.71338387 +0000 UTC m=+145.188404930" watchObservedRunningTime="2025-09-30 06:21:41.713947788 +0000 UTC m=+145.188968828"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.805590 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.806077 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.306057283 +0000 UTC m=+145.781078333 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.811950 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" podStartSLOduration=122.81192941 podStartE2EDuration="2m2.81192941s" podCreationTimestamp="2025-09-30 06:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:41.770103592 +0000 UTC m=+145.245124642" watchObservedRunningTime="2025-09-30 06:21:41.81192941 +0000 UTC m=+145.286950470"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.814087 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-nklgj" podStartSLOduration=123.814077748 podStartE2EDuration="2m3.814077748s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:41.808233033 +0000 UTC m=+145.283254083" watchObservedRunningTime="2025-09-30 06:21:41.814077748 +0000 UTC m=+145.289098788"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.832783 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-5652b" podStartSLOduration=123.832767782 podStartE2EDuration="2m3.832767782s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:41.83144387 +0000 UTC m=+145.306464920" watchObservedRunningTime="2025-09-30 06:21:41.832767782 +0000 UTC m=+145.307788822"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.876017 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Sep 30 06:21:41 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld
Sep 30 06:21:41 crc kubenswrapper[4691]: [+]process-running ok
Sep 30 06:21:41 crc kubenswrapper[4691]: healthz check failed
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.876067 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.876613 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jwl8z" podStartSLOduration=123.876596104 podStartE2EDuration="2m3.876596104s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:41.876296014 +0000 UTC m=+145.351317054" watchObservedRunningTime="2025-09-30 06:21:41.876596104 +0000 UTC m=+145.351617144"
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.914137 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Sep 30 06:21:41 crc kubenswrapper[4691]: E0930 06:21:41.914455 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.414438156 +0000 UTC m=+145.889459196 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Sep 30 06:21:41 crc kubenswrapper[4691]: I0930 06:21:41.975900 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-thj2p" podStartSLOduration=123.975868077 podStartE2EDuration="2m3.975868077s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:41.975261418 +0000 UTC m=+145.450282468" watchObservedRunningTime="2025-09-30 06:21:41.975868077 +0000 UTC m=+145.450889117"
Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.016042 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb"
Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.016410 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.516396404 +0000 UTC m=+145.991417444 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.040305 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-m9vv7" podStartSLOduration=124.040288023 podStartE2EDuration="2m4.040288023s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.034928353 +0000 UTC m=+145.509949393" watchObservedRunningTime="2025-09-30 06:21:42.040288023 +0000 UTC m=+145.515309063" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.118424 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.118584 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.618557509 +0000 UTC m=+146.093578549 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.118965 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.119237 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.61922525 +0000 UTC m=+146.094246290 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.125079 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-lww4z" podStartSLOduration=124.125062555 podStartE2EDuration="2m4.125062555s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.105111292 +0000 UTC m=+145.580132332" watchObservedRunningTime="2025-09-30 06:21:42.125062555 +0000 UTC m=+145.600083585" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.190981 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-lh444" podStartSLOduration=124.190947718 podStartE2EDuration="2m4.190947718s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.155701898 +0000 UTC m=+145.630722938" watchObservedRunningTime="2025-09-30 06:21:42.190947718 +0000 UTC m=+145.665968768" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.220319 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.220501 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.720474385 +0000 UTC m=+146.195495425 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.220759 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.221564 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.721531379 +0000 UTC m=+146.196552419 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.322382 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.322685 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.822633021 +0000 UTC m=+146.297654061 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.322747 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.323173 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.823165148 +0000 UTC m=+146.298186188 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.424025 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.424410 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:42.924393392 +0000 UTC m=+146.399414432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.525989 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.526379 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.026361341 +0000 UTC m=+146.501382381 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.591052 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" event={"ID":"ab09b6ea-c08c-4b80-a30a-09392178e10c","Type":"ContainerStarted","Data":"4c9d97663d33cbc02f9354d39301c4d18f84c3f5c46f1fd42c7a9ad55401c949"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.591388 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.592549 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" event={"ID":"09bc6f61-b7ce-46ff-bae4-04ff6609b246","Type":"ContainerStarted","Data":"8e2265a0cb8d01dac2f18d8f183a78710d26f37274e1b0d9ed5c36a5653be632"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.592561 4691 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-nmhdf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" start-of-body= Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.592616 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" podUID="ab09b6ea-c08c-4b80-a30a-09392178e10c" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.37:8443/healthz\": dial tcp 10.217.0.37:8443: connect: connection refused" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.593913 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" 
event={"ID":"c5e7cf84-31e8-4a66-971f-18cba2113669","Type":"ContainerStarted","Data":"0616f0b4a9eba1e689e79df6de3ad7c1452652c594cc0f93811c0d524e58ecf8"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.595432 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-stxbl" event={"ID":"c4d1f634-c758-400a-8dba-baefd045834d","Type":"ContainerStarted","Data":"9130ae934b01df75e791269b426cb2a8f46993a820ddad977ec90743dfa6bac2"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.595456 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-stxbl" event={"ID":"c4d1f634-c758-400a-8dba-baefd045834d","Type":"ContainerStarted","Data":"480d20733c4e8c1b929779a707310ea0928724ce2267334ade7843e78b00e9a8"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.595548 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.597148 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" event={"ID":"f1dfc875-304c-4f81-8d12-c5463743ad08","Type":"ContainerStarted","Data":"4841f0fa1141b496998e4d5b45f16239691f9509a0c1b6c286a0266a57e8cab9"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.598846 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" event={"ID":"84873517-271a-49cb-8a39-5f28ddb68148","Type":"ContainerStarted","Data":"808d9e5bb42fed161839425c85c317e6a0e32af0830db1f6eae5e82906f850f3"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.598927 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" event={"ID":"84873517-271a-49cb-8a39-5f28ddb68148","Type":"ContainerStarted","Data":"4a9966314479b6327e28321cd78b38a41a018d6e93024c71952585b9dcf7f782"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.600314 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99" event={"ID":"162d74bd-6a30-4fa0-88b7-2aa59426c6c8","Type":"ContainerStarted","Data":"fdfd98b5152af845e324daa6d7bdaaa40419a517e33c2b640ba31dcfc93f30d0"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.601684 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" event={"ID":"ee1c2dd6-d759-4d3c-9ec7-86ec11419202","Type":"ContainerStarted","Data":"429bec63b237944f3ff7ce5a290852d3c1c19cea1c5820dbad4363f40a165cbb"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.602687 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" event={"ID":"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8","Type":"ContainerStarted","Data":"3e3eaea348966226bce3d5ed1f615a20e73b6acce16227c61e972962c9df8826"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.608650 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" event={"ID":"ec8d16a6-daeb-4cc0-9815-35e09c34fb71","Type":"ContainerStarted","Data":"de70c6a341ca5e7d7678eba747bd367c80e60e45c7ef66b3022f5e47789ca3c7"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.609054 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" 
Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.610241 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" event={"ID":"f5ef6b93-5bb5-467f-8268-5feb300e2d5c","Type":"ContainerStarted","Data":"9dfc77521acadc64b053263fa9d2d2f760270714a851575b1333fd531f276b65"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.610932 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.611020 4691 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dchbw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.611055 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" podUID="ec8d16a6-daeb-4cc0-9815-35e09c34fb71" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.611552 4691 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dzf49 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.611575 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" podUID="f5ef6b93-5bb5-467f-8268-5feb300e2d5c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.613010 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" event={"ID":"37dd19aa-104d-4c79-859c-7161a185ad1c","Type":"ContainerStarted","Data":"fa0a3a67719c9e1249a74fae141510ddb193b9cec49b6a4c5bb46f2279082284"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.613046 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" event={"ID":"37dd19aa-104d-4c79-859c-7161a185ad1c","Type":"ContainerStarted","Data":"1e3c8e421d50c4419d257777834d28616c5a2239a117214a37e5171c3d41f9c4"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.614503 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-ss8nw" event={"ID":"3cd27b2f-3e77-4f36-b646-60e833384949","Type":"ContainerStarted","Data":"344dd13885a7e63cbdef7c1748418998e34551edb2030b6688aa7dbca252f161"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.615064 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.615785 4691 patch_prober.go:28] interesting pod/console-operator-58897d9998-ss8nw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection 
refused" start-of-body= Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.615831 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ss8nw" podUID="3cd27b2f-3e77-4f36-b646-60e833384949" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.616537 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" event={"ID":"16791f1d-bdee-46a9-9155-c1af947d96ec","Type":"ContainerStarted","Data":"205bae24bbcc73daa1c7fe1a086bdb81bf3821571e3748e450dcbd9b1a879200"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.616758 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.617939 4691 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-65pf2 container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body= Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.617971 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" podUID="16791f1d-bdee-46a9-9155-c1af947d96ec" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.618751 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" event={"ID":"57168c43-1c61-42dd-863a-d524aa606da0","Type":"ContainerStarted","Data":"41bcc76893c6c6ce06869beaff3b312815e1170c23eeca4d47a025e56f80619e"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.620681 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm" event={"ID":"ffd77185-f5b4-418f-b675-db92dbe4c19f","Type":"ContainerStarted","Data":"0000f14dcbeee8f9b1c1896f4c63f1095ce0d085e94d7e2159695d896eeb7e9c"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.620717 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm" event={"ID":"ffd77185-f5b4-418f-b675-db92dbe4c19f","Type":"ContainerStarted","Data":"2eac0f67245947d707024c9c3281fb1159245ba4e555de60c1c80afc0fd1ef72"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.621184 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" podStartSLOduration=124.621145771 podStartE2EDuration="2m4.621145771s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.618051123 +0000 UTC m=+146.093072193" watchObservedRunningTime="2025-09-30 06:21:42.621145771 +0000 UTC m=+146.096166811" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.625321 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" event={"ID":"64faa7eb-4089-40e6-b084-6924f274ac51","Type":"ContainerStarted","Data":"704ad4beea9843ef847a0098091abcf1b55ee23e13bcee43338a5115de8e89d6"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.625366 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" event={"ID":"64faa7eb-4089-40e6-b084-6924f274ac51","Type":"ContainerStarted","Data":"6aabf167af0ea6b467d3d2d6f09afe8e712ed5cc19619de7543fe5dfa0c46ae6"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.626724 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.626857 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd" event={"ID":"4e494e6a-0b34-4706-8284-5dc5086c89b3","Type":"ContainerStarted","Data":"a112c130eb6755483d9a0745b418d8a2681e440ddf600eb224c707866b955cf5"} Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.626900 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.126871813 +0000 UTC m=+146.601892853 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.627010 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.627374 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.127355978 +0000 UTC m=+146.602377018 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.628466 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gw6s6" event={"ID":"f8303a7f-5f1a-4ad3-9a02-a25fefffdc8b","Type":"ContainerStarted","Data":"95ecbcef4922edec0cd0c25181165946453a42fe35e1bd15f8296176970ffb69"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.630365 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" event={"ID":"b45e6c14-bbb6-4a9c-92e0-10c09fe37093","Type":"ContainerStarted","Data":"49777da31fa2734c4d0c8d47759e27a2888a539c20b5d1bb7e2a1f2aef547b10"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.631517 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" event={"ID":"929acffa-90b0-4dfc-a65b-a8758c000f41","Type":"ContainerStarted","Data":"dfd6df24b4e7258d0a84a8ea5140453d741a0193019e613b4a29510af4205ccd"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.633159 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj" event={"ID":"10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b","Type":"ContainerStarted","Data":"3d771147f0aeb114ae61700b112399fab74c5d61144f62b902c0e7ec17ee98e3"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.633196 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj" event={"ID":"10e7bec3-75fd-4ed0-bdcd-ff1ee3fd3f9b","Type":"ContainerStarted","Data":"b393fb501d8715584aa4e6e927f2216a302f5d1374b4826bc291f5ef49580f65"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.634307 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" event={"ID":"2997210f-48b1-46e1-bf0f-12ed24852c8b","Type":"ContainerStarted","Data":"79ceaab553f451787677604d794f3eb3fafb58007fd383669b6699922a8a510f"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.636524 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" event={"ID":"063ba2d8-4d97-474f-8c2d-0541f6f5cec5","Type":"ContainerStarted","Data":"b44fbd379238f2c084f3f17ee6faa48dbef7276bd737fc383294b680047d6f5c"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.636999 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.640135 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" event={"ID":"500b5163-5eae-4fcc-9546-76c8f59841fa","Type":"ContainerStarted","Data":"eb18196c29f316f81ee7f50b0e4d0c93b844c9f5d3d56944b3f67216be0582b2"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.640178 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" event={"ID":"500b5163-5eae-4fcc-9546-76c8f59841fa","Type":"ContainerStarted","Data":"c961a6350a010517d85539a6ddf8a04df0ab39909591229e2219802cf519feee"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.640723 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.642515 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" event={"ID":"c359f4fc-5f36-4b96-8524-f885abe54d26","Type":"ContainerStarted","Data":"df2c3407480143bc1c823e8395e19c94f405ef06de41f5f4371cb4db1976efb6"} Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.642670 4691 patch_prober.go:28] interesting pod/downloads-7954f5f757-nklgj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.642704 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nklgj" podUID="a779ae38-da0c-4953-8d61-6047076785d2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.643590 4691 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-4bdqt container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.643663 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" podUID="7a677441-8b2d-41ae-8dd8-e3334c16c700" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.651723 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" podStartSLOduration=124.651708952 podStartE2EDuration="2m4.651708952s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.65007176 +0000 UTC m=+146.125092800" watchObservedRunningTime="2025-09-30 06:21:42.651708952 +0000 UTC m=+146.126729992" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.679298 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" podStartSLOduration=124.679283567 podStartE2EDuration="2m4.679283567s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.677538662 +0000 UTC m=+146.152559722" watchObservedRunningTime="2025-09-30 06:21:42.679283567 +0000 UTC m=+146.154304607" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.698865 4691 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-kzftt" podStartSLOduration=124.698843389 podStartE2EDuration="2m4.698843389s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.698018863 +0000 UTC m=+146.173039923" watchObservedRunningTime="2025-09-30 06:21:42.698843389 +0000 UTC m=+146.173864439" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.726721 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-ss8nw" podStartSLOduration=124.726701854 podStartE2EDuration="2m4.726701854s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.724368569 +0000 UTC m=+146.199389619" watchObservedRunningTime="2025-09-30 06:21:42.726701854 +0000 UTC m=+146.201722894" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.727854 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.731319 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.2313034 +0000 UTC m=+146.706324440 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.763340 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-6gt99" podStartSLOduration=124.763324566 podStartE2EDuration="2m4.763324566s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.761112156 +0000 UTC m=+146.236133196" watchObservedRunningTime="2025-09-30 06:21:42.763324566 +0000 UTC m=+146.238345606" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.794276 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" podStartSLOduration=124.794263159 podStartE2EDuration="2m4.794263159s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.792735371 +0000 UTC m=+146.267756411" watchObservedRunningTime="2025-09-30 06:21:42.794263159 +0000 UTC m=+146.269284199" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.818646 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-cc4c5" podStartSLOduration=124.818627684 podStartE2EDuration="2m4.818627684s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.817521628 +0000 UTC m=+146.292542658" watchObservedRunningTime="2025-09-30 06:21:42.818627684 +0000 UTC m=+146.293648724" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.831335 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.831740 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.331716769 +0000 UTC m=+146.806737809 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.834493 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-9d949" podStartSLOduration=124.834481747 podStartE2EDuration="2m4.834481747s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.832851236 +0000 UTC m=+146.307872286" watchObservedRunningTime="2025-09-30 06:21:42.834481747 +0000 UTC m=+146.309502787" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.882048 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:42 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:42 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:42 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.882119 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.892404 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-62hxb" podStartSLOduration=124.892385586 podStartE2EDuration="2m4.892385586s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.869353895 +0000 UTC m=+146.344374955" watchObservedRunningTime="2025-09-30 06:21:42.892385586 +0000 UTC m=+146.367406626" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.909774 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-k6vnd" podStartSLOduration=123.909759688 podStartE2EDuration="2m3.909759688s" podCreationTimestamp="2025-09-30 06:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.895179094 +0000 UTC m=+146.370200134" watchObservedRunningTime="2025-09-30 06:21:42.909759688 +0000 UTC m=+146.384780728" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.910436 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" podStartSLOduration=124.910432769 podStartE2EDuration="2m4.910432769s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.908964052 +0000 UTC 
m=+146.383985092" watchObservedRunningTime="2025-09-30 06:21:42.910432769 +0000 UTC m=+146.385453809" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.934638 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:42 crc kubenswrapper[4691]: E0930 06:21:42.935096 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.435079082 +0000 UTC m=+146.910100122 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.943595 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-stxbl" podStartSLOduration=6.943571262 podStartE2EDuration="6.943571262s" podCreationTimestamp="2025-09-30 06:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.943242951 +0000 UTC m=+146.418263991" watchObservedRunningTime="2025-09-30 06:21:42.943571262 +0000 UTC m=+146.418592302" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.968131 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" podStartSLOduration=124.968112301 podStartE2EDuration="2m4.968112301s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.967008526 +0000 UTC m=+146.442029566" watchObservedRunningTime="2025-09-30 06:21:42.968112301 +0000 UTC m=+146.443133331" Sep 30 06:21:42 crc kubenswrapper[4691]: I0930 06:21:42.983109 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.013790 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-srgxj" podStartSLOduration=125.013772031 podStartE2EDuration="2m5.013772031s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:42.992694421 +0000 UTC m=+146.467715471" watchObservedRunningTime="2025-09-30 06:21:43.013772031 +0000 UTC m=+146.488793071" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.035976 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.036337 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.536325458 +0000 UTC m=+147.011346498 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.043105 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" podStartSLOduration=125.043088082 podStartE2EDuration="2m5.043088082s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.016126026 +0000 UTC m=+146.491147086" watchObservedRunningTime="2025-09-30 06:21:43.043088082 +0000 UTC m=+146.518109132" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.044810 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-6q5zd" podStartSLOduration=125.044801097 podStartE2EDuration="2m5.044801097s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.041581724 +0000 UTC m=+146.516602774" watchObservedRunningTime="2025-09-30 06:21:43.044801097 +0000 UTC m=+146.519822147" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.090560 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-kr8lp" podStartSLOduration=125.0905422 podStartE2EDuration="2m5.0905422s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.070073859 +0000 UTC m=+146.545094899" watchObservedRunningTime="2025-09-30 06:21:43.0905422 +0000 UTC m=+146.565563240" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.126584 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-bjpn7" podStartSLOduration=125.126568734 podStartE2EDuration="2m5.126568734s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.09432886 +0000 UTC m=+146.569349890" watchObservedRunningTime="2025-09-30 06:21:43.126568734 +0000 UTC m=+146.601589774" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.136568 4691 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.136864 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.63684874 +0000 UTC m=+147.111869780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.145800 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" podStartSLOduration=125.145790354 podStartE2EDuration="2m5.145790354s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.126940996 +0000 UTC m=+146.601962046" watchObservedRunningTime="2025-09-30 06:21:43.145790354 +0000 UTC m=+146.620811394" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.170321 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2hd2f" podStartSLOduration=125.170303823 podStartE2EDuration="2m5.170303823s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.168566138 +0000 UTC m=+146.643587178" watchObservedRunningTime="2025-09-30 06:21:43.170303823 +0000 UTC m=+146.645324863" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.170393 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gw6s6" podStartSLOduration=7.170389485 podStartE2EDuration="7.170389485s" podCreationTimestamp="2025-09-30 06:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.146838857 +0000 UTC m=+146.621859897" watchObservedRunningTime="2025-09-30 06:21:43.170389485 +0000 UTC m=+146.645410525" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.213222 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-4s6nm" podStartSLOduration=125.213204836 podStartE2EDuration="2m5.213204836s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.193414927 +0000 UTC m=+146.668435967" watchObservedRunningTime="2025-09-30 06:21:43.213204836 +0000 UTC m=+146.688225876" Sep 30 
06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.237481 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.237774 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.737762505 +0000 UTC m=+147.212783535 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.243396 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-htgw7" podStartSLOduration=124.243377374 podStartE2EDuration="2m4.243377374s" podCreationTimestamp="2025-09-30 06:19:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.215731006 +0000 UTC m=+146.690752046" watchObservedRunningTime="2025-09-30 06:21:43.243377374 +0000 UTC m=+146.718398414" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.243518 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" podStartSLOduration=125.243512828 podStartE2EDuration="2m5.243512828s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.242703582 +0000 UTC m=+146.717724622" watchObservedRunningTime="2025-09-30 06:21:43.243512828 +0000 UTC m=+146.718533868" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.268019 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kp6rk" podStartSLOduration=125.268002476 podStartE2EDuration="2m5.268002476s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:43.26372469 +0000 UTC m=+146.738745730" watchObservedRunningTime="2025-09-30 06:21:43.268002476 +0000 UTC m=+146.743023506" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.338317 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.338491 4691 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.838463324 +0000 UTC m=+147.313484364 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.338922 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.339208 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.839193357 +0000 UTC m=+147.314214397 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.440204 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.440381 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.94035536 +0000 UTC m=+147.415376390 (durationBeforeRetry 500ms). 
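
The Error text that follows, and each of its repeats below, comes from one condition: kubelet resolves the driver name embedded in the volume ID (kubevirt.io.hostpath-provisioner^pvc-657094db-...) against its registry of CSI drivers registered on this node, and the driver has not announced itself yet; its plugin pod, hostpath-provisioner/csi-hostpathplugin-h6qlb, is only starting its containers in this same window. A minimal sketch of that lookup miss, with hypothetical names rather than kubelet's actual types:

```go
package main

import (
	"errors"
	"fmt"
	"sync"
)

// csiDriverRegistry is a stand-in for kubelet's in-memory map of CSI drivers
// that have registered on the node; newCsiDriverClient-style constructors
// fail fast on a miss, producing the error text seen in this log.
type csiDriverRegistry struct {
	mu      sync.RWMutex
	drivers map[string]string // driver name -> node plugin socket (path below is hypothetical)
}

func (r *csiDriverRegistry) register(name, endpoint string) {
	r.mu.Lock()
	defer r.mu.Unlock()
	r.drivers[name] = endpoint
}

func (r *csiDriverRegistry) client(name string) (string, error) {
	r.mu.RLock()
	defer r.mu.RUnlock()
	ep, ok := r.drivers[name]
	if !ok {
		return "", errors.New("driver name " + name + " not found in the list of registered CSI drivers")
	}
	return ep, nil
}

func main() {
	reg := &csiDriverRegistry{drivers: map[string]string{}}
	if _, err := reg.client("kubevirt.io.hostpath-provisioner"); err != nil {
		fmt.Println("before registration:", err) // matches the log's failure text
	}
	reg.register("kubevirt.io.hostpath-provisioner", "/var/lib/kubelet/plugins/csi-hostpath/csi.sock")
	if ep, err := reg.client("kubevirt.io.hostpath-provisioner"); err == nil {
		fmt.Println("after registration:", ep)
	}
}
```

Both failing paths, Unmounter.TearDownAt and attacher.MountDevice, go through the same lookup, which is why the two errors alternate almost line for line below.
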
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.440508 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.440843 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:43.940835064 +0000 UTC m=+147.415856104 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.541980 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.542162 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.042137042 +0000 UTC m=+147.517158082 (durationBeforeRetry 500ms). 
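
Stepping away from the mount loop briefly: the pod_startup_latency_tracker entries above are plain timestamp arithmetic, with podStartE2EDuration being watchObservedRunningTime minus podCreationTimestamp. A check against the apiserver-7bbb656c7d-fthnz numbers, with timestamps copied from the log and the monotonic "m=+..." suffix dropped (Go accepts the fractional seconds during parsing even though the layout omits them):

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05 -0700 MST"
	created, err := time.Parse(layout, "2025-09-30 06:19:38 +0000 UTC") // podCreationTimestamp
	if err != nil {
		panic(err)
	}
	observed, err := time.Parse(layout, "2025-09-30 06:21:43.043088082 +0000 UTC") // watchObservedRunningTime
	if err != nil {
		panic(err)
	}
	// Prints 2m5.043088082s, the logged podStartE2EDuration (125.043088082s SLO duration).
	fmt.Println(observed.Sub(created))
}
```
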
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.542321 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.542574 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.042566156 +0000 UTC m=+147.517587196 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.643427 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.643615 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.143588474 +0000 UTC m=+147.618609514 (durationBeforeRetry 500ms). 
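
Each failure re-arms the retry gate logged by nestedpendingoperations.go:348: the operation is parked until the printed deadline, after which the reconciler may attempt it again. The 500ms comes straight from these lines; the doubling and the cap in the sketch below are assumptions standing in for kubelet's actual constants, which this log never exercises since every visible attempt still prints 500ms:

```go
package main

import (
	"fmt"
	"time"
)

// backoff is a sketch of the per-operation retry bookkeeping: on each
// failure, record when the operation may run again and grow the delay.
type backoff struct {
	delay   time.Duration // durationBeforeRetry in the log lines
	retryAt time.Time     // "No retries permitted until ..."
}

func (b *backoff) fail(now time.Time) {
	const initial = 500 * time.Millisecond // taken from the log
	const maxDelay = 2 * time.Minute       // assumed cap, not shown in this log
	switch {
	case b.delay == 0:
		b.delay = initial
	case b.delay*2 <= maxDelay:
		b.delay *= 2 // assumed factor-of-two growth
	}
	b.retryAt = now.Add(b.delay)
}

func main() {
	var b backoff
	now := time.Unix(0, 0)
	for i := 0; i < 3; i++ {
		b.fail(now)
		fmt.Printf("failure %d: no retries permitted for %v\n", i+1, b.delay)
		now = b.retryAt
	}
}
```
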
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.643832 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.644193 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.144184884 +0000 UTC m=+147.619205924 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.648548 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" event={"ID":"66f39462-9632-40fb-abfa-5c13b3365d59","Type":"ContainerStarted","Data":"de529e4f3109c779ae2371c49159df4b6d31417c7ea418f33be5037919f5a8bc"} Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.649102 4691 patch_prober.go:28] interesting pod/console-operator-58897d9998-ss8nw container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.649123 4691 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dzf49 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.649131 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-ss8nw" podUID="3cd27b2f-3e77-4f36-b646-60e833384949" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.649145 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" podUID="f5ef6b93-5bb5-467f-8268-5feb300e2d5c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Sep 30 06:21:43 
crc kubenswrapper[4691]: I0930 06:21:43.659359 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nmhdf" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.676991 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-65pf2" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.744472 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.746232 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.246209863 +0000 UTC m=+147.721230913 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.746546 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.748074 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.248052613 +0000 UTC m=+147.723073653 (durationBeforeRetry 500ms). 
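
Note the two different keys inside the braces of the nestedpendingoperations messages: the MountDevice operation is device-scoped, so its key carries an empty podName, while the TearDown operation is keyed to the terminated pod's UID (8f668bae-...). Rendered as a struct (illustrative, not kubelet's type), the two stuck operations are tracked, and retried, independently:

```go
package main

import "fmt"

// operationKey mirrors the "{volumeName:... podName:... nodeName:}" strings
// in the log: device-scoped MountDevice leaves podName empty; pod-scoped
// TearDown carries the pod UID.
type operationKey struct {
	volumeName, podName, nodeName string
}

func main() {
	const vol = "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"
	pending := map[operationKey]string{
		{volumeName: vol}: "MountDevice",
		{volumeName: vol, podName: "8f668bae-612b-4b75-9490-919e737c6a3b"}: "TearDown",
	}
	for key, op := range pending {
		fmt.Printf("%-11s %+v\n", op, key)
	}
}
```
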
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.852727 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.852899 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.352859611 +0000 UTC m=+147.827880651 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.853049 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.853359 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.353351126 +0000 UTC m=+147.828372166 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.873902 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:43 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:43 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:43 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.873966 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.954108 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:43 crc kubenswrapper[4691]: E0930 06:21:43.954421 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.454405026 +0000 UTC m=+147.929426056 (durationBeforeRetry 500ms). 
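
The router probe body above ([-]backend-http failed: reason withheld, ..., healthz check failed) is the standard Kubernetes healthz mux shape: one [+] or [-] line per named check, failure reasons withheld from unauthorized callers, and HTTP 500 overall when any check fails. A handler producing that shape might look like the sketch below; it is illustrative, not the router's code, with the check names taken from the log:

```go
package main

import (
	"bytes"
	"fmt"
	"net/http"
)

// healthz renders "[+]name ok" / "[-]name failed: reason withheld" lines and
// returns 500 when any check fails; the body is buffered so the status code
// can be set before anything is written.
func healthz(w http.ResponseWriter, _ *http.Request) {
	checks := []struct {
		name string
		ok   bool
	}{{"backend-http", false}, {"has-synced", false}, {"process-running", true}}

	var buf bytes.Buffer
	healthy := true
	for _, c := range checks {
		if c.ok {
			fmt.Fprintf(&buf, "[+]%s ok\n", c.name)
		} else {
			fmt.Fprintf(&buf, "[-]%s failed: reason withheld\n", c.name)
			healthy = false
		}
	}
	if !healthy {
		buf.WriteString("healthz check failed\n")
		w.WriteHeader(http.StatusInternalServerError) // the prober then logs "statuscode: 500"
	}
	w.Write(buf.Bytes())
}

func main() {
	http.HandleFunc("/healthz", healthz)
	http.ListenAndServe(":8080", nil)
}
```
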
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.957594 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.958714 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.958790 4691 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5hlv9 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Sep 30 06:21:43 crc kubenswrapper[4691]: I0930 06:21:43.958819 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" podUID="37dd19aa-104d-4c79-859c-7161a185ad1c" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.023797 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.023850 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.055227 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.055549 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.555536858 +0000 UTC m=+148.030557898 (durationBeforeRetry 500ms). 
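
Two distinct probe failures are interleaved here, and they mean different things: "connect: connection refused" against 10.217.0.5:8443 says nothing is listening yet, while the router's 500 says the process is up but failing its own checks. A readiness check in the spirit of kubelet's HTTP prober, which treats any 2xx/3xx status as success; the endpoint is taken from the marketplace-operator lines and the one-second timeout is an assumption:

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probe issues a GET with a short timeout and classifies the result the way
// kubelet's HTTP prober does: transport errors and non-2xx/3xx are failures.
func probe(url string) string {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Sprintf("failure: Get %q: %v", url, err) // e.g. "connect: connection refused"
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return "success"
	}
	return fmt.Sprintf("failure: HTTP probe failed with statuscode: %d", resp.StatusCode)
}

func main() {
	fmt.Println(probe("http://10.217.0.40:8080/healthz"))
}
```
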
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.156101 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.156245 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.656205455 +0000 UTC m=+148.131226495 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.156671 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.157033 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.657022292 +0000 UTC m=+148.132043332 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.258081 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.258294 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.758254046 +0000 UTC m=+148.233275086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.258371 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.258664 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.758649849 +0000 UTC m=+148.233670889 (durationBeforeRetry 500ms). 
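
While these retries spin, the node-side question is whether the driver's registration socket exists yet: drivers announce themselves to kubelet by dropping a socket under the plugin registry directory. A spot check (not kubelet code), assuming the default /var/lib/kubelet root that appears elsewhere in this log:

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// Lists the registration sockets under kubelet's plugin registry. An empty
// listing while csi-hostpathplugin-h6qlb is still starting is consistent with
// the "not found in the list of registered CSI drivers" failures above.
// Adjust the path if kubelet runs with a non-default --root-dir.
func main() {
	const registry = "/var/lib/kubelet/plugins_registry"
	entries, err := os.ReadDir(registry)
	if err != nil {
		fmt.Fprintln(os.Stderr, "cannot read plugin registry:", err)
		os.Exit(1)
	}
	for _, e := range entries {
		fmt.Println(filepath.Join(registry, e.Name()))
	}
}
```
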
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.359654 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.359845 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.859819902 +0000 UTC m=+148.334840942 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.360066 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.360457 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.860448532 +0000 UTC m=+148.335469572 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.461437 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.461615 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.961585374 +0000 UTC m=+148.436606414 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.461928 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.462252 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:44.962244816 +0000 UTC m=+148.437265856 (durationBeforeRetry 500ms). 
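
The steady ~100ms cadence of the paired reconciler_common.go:159/:218 lines is the volume manager's reconciler re-walking its desired and actual state each pass: pod 8f668bae-... is gone and should release the PVC, pod c134a0fe-... (image-registry-697d97f7c8-xlxgb) wants it mounted, and both resulting operations die at the same driver lookup. The shape of one such pass, with illustrative types rather than kubelet's:

```go
package main

import "fmt"

// reconcile walks one volume keyed by pod UID: pods holding the volume but no
// longer desired get an unmount; desired pods not yet holding it get a mount.
func reconcile(volume string, desired, actual map[string]bool) {
	for pod := range actual {
		if !desired[pod] {
			fmt.Printf("operationExecutor.UnmountVolume started for volume %q pod %q\n", volume, pod)
		}
	}
	for pod := range desired {
		if !actual[pod] {
			fmt.Printf("operationExecutor.MountVolume started for volume %q pod %q\n", volume, pod)
		}
	}
}

func main() {
	const pvc = "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8"
	desired := map[string]bool{"c134a0fe-e3a2-4683-95d1-045ba2056b14": true} // new image-registry pod
	actual := map[string]bool{"8f668bae-612b-4b75-9490-919e737c6a3b": true}  // terminated pod still holding the mount
	reconcile(pvc, desired, actual)
}
```
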
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.562840 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.563030 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.063002086 +0000 UTC m=+148.538023126 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.563145 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.563486 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.06347413 +0000 UTC m=+148.538495170 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.650433 4691 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-dchbw container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": context deadline exceeded" start-of-body= Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.650480 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" podUID="ec8d16a6-daeb-4cc0-9815-35e09c34fb71" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": context deadline exceeded" Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.654944 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" event={"ID":"66f39462-9632-40fb-abfa-5c13b3365d59","Type":"ContainerStarted","Data":"7f3d3aed33a37845b9fe4eb155c55d3bfe1ad2a1644b3ac29f02ec8c9afd04d6"} Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.655895 4691 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-dzf49 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.655938 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" podUID="f5ef6b93-5bb5-467f-8268-5feb300e2d5c" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.40:8080/healthz\": dial tcp 10.217.0.40:8080: connect: connection refused" Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.664441 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.664634 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.164612052 +0000 UTC m=+148.639633132 (durationBeforeRetry 500ms). 
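
The resolution is already in flight: each PLEG ContainerStarted event for hostpath-provisioner/csi-hostpathplugin-h6qlb brings the driver's registrar closer to running. From the cluster side, registration can be confirmed through the node's CSINode object, which kubelet updates as drivers register; a client-go sketch, assuming a reachable kubeconfig at the default home path:

```go
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
	if err != nil {
		panic(err)
	}
	cs := kubernetes.NewForConfigOrDie(cfg)
	// "crc" is the node name throughout this log.
	csiNode, err := cs.StorageV1().CSINodes().Get(context.TODO(), "crc", metav1.GetOptions{})
	if err != nil {
		panic(err)
	}
	// kubevirt.io.hostpath-provisioner should appear here once the registrar
	// in csi-hostpathplugin-h6qlb has completed the handshake.
	for _, d := range csiNode.Spec.Drivers {
		fmt.Println("registered:", d.Name)
	}
}
```
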
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.664847 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.665181 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.16516935 +0000 UTC m=+148.640190390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.765506 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.765735 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.265700323 +0000 UTC m=+148.740721363 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.767000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.767117 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.267108698 +0000 UTC m=+148.742129738 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.801542 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-ss8nw" Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.868485 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.868691 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.368663103 +0000 UTC m=+148.843684143 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.868787 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.869100 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.369089916 +0000 UTC m=+148.844110956 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.877043 4691 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-fthnz container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Sep 30 06:21:44 crc kubenswrapper[4691]: [+]log ok Sep 30 06:21:44 crc kubenswrapper[4691]: [+]etcd ok Sep 30 06:21:44 crc kubenswrapper[4691]: [+]poststarthook/start-apiserver-admission-initializer ok Sep 30 06:21:44 crc kubenswrapper[4691]: [-]poststarthook/generic-apiserver-start-informers failed: reason withheld Sep 30 06:21:44 crc kubenswrapper[4691]: [+]poststarthook/max-in-flight-filter ok Sep 30 06:21:44 crc kubenswrapper[4691]: [+]poststarthook/storage-object-count-tracker-hook ok Sep 30 06:21:44 crc kubenswrapper[4691]: [+]poststarthook/openshift.io-StartUserInformer ok Sep 30 06:21:44 crc kubenswrapper[4691]: [+]poststarthook/openshift.io-StartOAuthInformer ok Sep 30 06:21:44 crc kubenswrapper[4691]: [+]poststarthook/openshift.io-StartTokenTimeoutUpdater ok Sep 30 06:21:44 crc kubenswrapper[4691]: livez check failed Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.877104 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" podUID="b45e6c14-bbb6-4a9c-92e0-10c09fe37093" containerName="oauth-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.880249 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:44 crc kubenswrapper[4691]: [-]has-synced failed: 
reason withheld Sep 30 06:21:44 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:44 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.880277 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.962692 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.964747 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.969482 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:44 crc kubenswrapper[4691]: E0930 06:21:44.970029 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.470005412 +0000 UTC m=+148.945026442 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.989184 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Sep 30 06:21:44 crc kubenswrapper[4691]: I0930 06:21:44.990232 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.032701 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.071046 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vzgfj"] Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.071612 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc9f14f0-3298-44a3-bf88-833165f8a662-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bc9f14f0-3298-44a3-bf88-833165f8a662\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.071654 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.071686 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.071710 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.071738 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc9f14f0-3298-44a3-bf88-833165f8a662-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bc9f14f0-3298-44a3-bf88-833165f8a662\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.071927 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: E0930 06:21:45.073070 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.573058124 +0000 UTC m=+149.048079164 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.075977 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.089900 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.090076 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.091872 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzgfj"] Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.173426 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.173565 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc9f14f0-3298-44a3-bf88-833165f8a662-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bc9f14f0-3298-44a3-bf88-833165f8a662\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.173587 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.173630 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7j542\" (UniqueName: \"kubernetes.io/projected/b2bb2e76-e094-4320-8bab-f54bf623dcb1-kube-api-access-7j542\") pod \"certified-operators-vzgfj\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.173651 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.173672 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-utilities\") pod \"certified-operators-vzgfj\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.173689 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-catalog-content\") pod \"certified-operators-vzgfj\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.173714 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc9f14f0-3298-44a3-bf88-833165f8a662-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bc9f14f0-3298-44a3-bf88-833165f8a662\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.174032 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc9f14f0-3298-44a3-bf88-833165f8a662-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"bc9f14f0-3298-44a3-bf88-833165f8a662\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 06:21:45 crc kubenswrapper[4691]: E0930 06:21:45.174166 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.674146455 +0000 UTC m=+149.149167495 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.183540 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.190520 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.205204 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc9f14f0-3298-44a3-bf88-833165f8a662-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"bc9f14f0-3298-44a3-bf88-833165f8a662\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.265518 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.268209 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-qln5x"] Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.269108 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.274774 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-utilities\") pod \"certified-operators-vzgfj\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.274813 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-catalog-content\") pod \"certified-operators-vzgfj\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.274896 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.274942 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7j542\" (UniqueName: \"kubernetes.io/projected/b2bb2e76-e094-4320-8bab-f54bf623dcb1-kube-api-access-7j542\") pod \"certified-operators-vzgfj\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.275508 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-utilities\") pod \"certified-operators-vzgfj\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.275709 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-catalog-content\") pod \"certified-operators-vzgfj\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: E0930 06:21:45.275943 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.775931788 +0000 UTC m=+149.250952828 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.276213 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.276426 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.292181 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qln5x"] Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.297653 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7j542\" (UniqueName: \"kubernetes.io/projected/b2bb2e76-e094-4320-8bab-f54bf623dcb1-kube-api-access-7j542\") pod \"certified-operators-vzgfj\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.349431 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.352156 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.375793 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.375987 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-utilities\") pod \"community-operators-qln5x\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.376009 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7wxt\" (UniqueName: \"kubernetes.io/projected/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-kube-api-access-x7wxt\") pod \"community-operators-qln5x\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.376070 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-catalog-content\") pod \"community-operators-qln5x\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: E0930 06:21:45.376179 4691 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.876164051 +0000 UTC m=+149.351185091 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.414147 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.451506 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qxlzp"] Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.452539 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.477743 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.477806 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-utilities\") pod \"community-operators-qln5x\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.477825 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7wxt\" (UniqueName: \"kubernetes.io/projected/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-kube-api-access-x7wxt\") pod \"community-operators-qln5x\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.477867 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-catalog-content\") pod \"community-operators-qln5x\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.478419 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-catalog-content\") pod \"community-operators-qln5x\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: E0930 06:21:45.478655 4691 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:45.978645137 +0000 UTC m=+149.453666177 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.478974 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-utilities\") pod \"community-operators-qln5x\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.529626 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7wxt\" (UniqueName: \"kubernetes.io/projected/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-kube-api-access-x7wxt\") pod \"community-operators-qln5x\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.532839 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxlzp"] Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.575206 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-bshxb" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.579408 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.579705 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-catalog-content\") pod \"certified-operators-qxlzp\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.579745 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g958m\" (UniqueName: \"kubernetes.io/projected/07c0a985-95a5-4fd3-a271-9e46d5f51af4-kube-api-access-g958m\") pod \"certified-operators-qxlzp\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.579808 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-utilities\") pod \"certified-operators-qxlzp\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: 
E0930 06:21:45.579979 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:46.079960814 +0000 UTC m=+149.554981864 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.634142 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.667217 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-x8kfg"] Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.668295 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.682686 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-utilities\") pod \"certified-operators-qxlzp\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.682717 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.682759 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-catalog-content\") pod \"certified-operators-qxlzp\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.682789 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g958m\" (UniqueName: \"kubernetes.io/projected/07c0a985-95a5-4fd3-a271-9e46d5f51af4-kube-api-access-g958m\") pod \"certified-operators-qxlzp\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.685261 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x8kfg"] Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.685670 4691 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.686641 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-utilities\") pod \"certified-operators-qxlzp\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: E0930 06:21:45.688092 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:46.187960175 +0000 UTC m=+149.662981215 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.688447 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-catalog-content\") pod \"certified-operators-qxlzp\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.703602 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" event={"ID":"66f39462-9632-40fb-abfa-5c13b3365d59","Type":"ContainerStarted","Data":"838b98a0b9aef133bd5c95a5f9f349bd66e4a658ee0aa48bc46f77a01d77f408"} Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.703649 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" event={"ID":"66f39462-9632-40fb-abfa-5c13b3365d59","Type":"ContainerStarted","Data":"17207909054d2149a06d7e7934455ace6d6bdf0afe5d91fe5b3bff94fe5106c9"} Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.741712 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g958m\" (UniqueName: \"kubernetes.io/projected/07c0a985-95a5-4fd3-a271-9e46d5f51af4-kube-api-access-g958m\") pod \"certified-operators-qxlzp\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.766565 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.783642 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:45 crc kubenswrapper[4691]: E0930 06:21:45.783857 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:46.283818869 +0000 UTC m=+149.758839909 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.784035 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-catalog-content\") pod \"community-operators-x8kfg\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.784123 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-utilities\") pod \"community-operators-x8kfg\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.784160 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f92b\" (UniqueName: \"kubernetes.io/projected/c5278b1f-f696-4223-a657-24a3d309b5f3-kube-api-access-4f92b\") pod \"community-operators-x8kfg\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.886638 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:45 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:45 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:45 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.886950 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.894579 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-catalog-content\") pod \"community-operators-x8kfg\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.894622 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-utilities\") pod \"community-operators-x8kfg\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.894645 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f92b\" (UniqueName: 
\"kubernetes.io/projected/c5278b1f-f696-4223-a657-24a3d309b5f3-kube-api-access-4f92b\") pod \"community-operators-x8kfg\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.894702 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:45 crc kubenswrapper[4691]: E0930 06:21:45.894994 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:46.39498137 +0000 UTC m=+149.870002410 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.895367 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-utilities\") pod \"community-operators-x8kfg\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.895503 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-catalog-content\") pod \"community-operators-x8kfg\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:45 crc kubenswrapper[4691]: I0930 06:21:45.932660 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f92b\" (UniqueName: \"kubernetes.io/projected/c5278b1f-f696-4223-a657-24a3d309b5f3-kube-api-access-4f92b\") pod \"community-operators-x8kfg\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:45.996075 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:46 crc kubenswrapper[4691]: E0930 06:21:45.996415 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:46.496399501 +0000 UTC m=+149.971420541 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.010976 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-h6qlb" podStartSLOduration=10.010960033 podStartE2EDuration="10.010960033s" podCreationTimestamp="2025-09-30 06:21:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:45.760758827 +0000 UTC m=+149.235779877" watchObservedRunningTime="2025-09-30 06:21:46.010960033 +0000 UTC m=+149.485981073" Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.088763 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.097687 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:46 crc kubenswrapper[4691]: E0930 06:21:46.097973 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:46.597959876 +0000 UTC m=+150.072980916 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.120342 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.160161 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vzgfj"] Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.198405 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:46 crc kubenswrapper[4691]: E0930 06:21:46.198551 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-09-30 06:21:46.698503069 +0000 UTC m=+150.173524109 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.198925 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:46 crc kubenswrapper[4691]: E0930 06:21:46.199206 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:46.699194021 +0000 UTC m=+150.174215061 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.301315 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:46 crc kubenswrapper[4691]: E0930 06:21:46.301575 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:46.801546022 +0000 UTC m=+150.276567062 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.339349 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-qln5x"] Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.397209 4691 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-09-30T06:21:45.685682262Z","Handler":null,"Name":""} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.402527 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:46 crc kubenswrapper[4691]: E0930 06:21:46.402808 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-09-30 06:21:46.902796187 +0000 UTC m=+150.377817227 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xlxgb" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.476537 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qxlzp"] Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.505286 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:46 crc kubenswrapper[4691]: E0930 06:21:46.505659 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-09-30 06:21:47.005643155 +0000 UTC m=+150.480664195 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.533536 4691 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.533569 4691 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.606639 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.608710 4691 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.608742 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.626726 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-x8kfg"] Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.673560 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xlxgb\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.707871 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.711642 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod 
"8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.731078 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"02e5e6eeacd5a8717f98e10124ade817d7738c032e229d6b5b2d1e76744dabb5"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.731128 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"9d5dc54fde38110f3cf241379320fbe0f389a33fa166cc82a135aeae06d0937b"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.740916 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bc9f14f0-3298-44a3-bf88-833165f8a662","Type":"ContainerStarted","Data":"b03b49417b62fc63c40bbfdaf1b26387bcddc926de69ea6c0be53498928431c2"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.740993 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bc9f14f0-3298-44a3-bf88-833165f8a662","Type":"ContainerStarted","Data":"08e40c4a256c2d92fdbaaa7fc5714acfdb1b97628b713b9626463986e6c2547d"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.742819 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"dacab52bed32cee8a457e3f004be3530fc059cebdc1adb9673c4737943335a94"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.742852 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"8eb3a7f195e97806826c488d86ca97a39cebd59204e0614e093bad7ff5f709e0"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.743174 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.745937 4691 generic.go:334] "Generic (PLEG): container finished" podID="e2df0f77-43e6-417a-a2ef-dfee3e80cbd8" containerID="3e3eaea348966226bce3d5ed1f615a20e73b6acce16227c61e972962c9df8826" exitCode=0 Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.746170 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" event={"ID":"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8","Type":"ContainerDied","Data":"3e3eaea348966226bce3d5ed1f615a20e73b6acce16227c61e972962c9df8826"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.754063 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qln5x" event={"ID":"ca900afa-86c0-4fe0-ba3d-d5d927db24b7","Type":"ContainerStarted","Data":"a567bb7108c3a33eb4e9d29a4a97e035ef323ab632d29711d02c18b2c482e30a"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.754103 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qln5x" 
event={"ID":"ca900afa-86c0-4fe0-ba3d-d5d927db24b7","Type":"ContainerStarted","Data":"321515a17ac9366d6b36f7f23032258fc844948e9fef19851c1e2f6aafd683b0"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.757057 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.762818 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f4ab40ea127c549c445e53d88cd8a2c4066367ce7a4a0ac32f208b2f0ebaf086"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.762859 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"e8951936ce82c41fa56f94aa1f1e054e81e275c5ef443007bed1b0a48fbdd126"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.765356 4691 generic.go:334] "Generic (PLEG): container finished" podID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerID="1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160" exitCode=0 Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.765419 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxlzp" event={"ID":"07c0a985-95a5-4fd3-a271-9e46d5f51af4","Type":"ContainerDied","Data":"1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.765444 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxlzp" event={"ID":"07c0a985-95a5-4fd3-a271-9e46d5f51af4","Type":"ContainerStarted","Data":"d124bab712a6d32378971c33248edfa1ff3bb48979fc5c6a6399963b7c20c04b"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.768603 4691 generic.go:334] "Generic (PLEG): container finished" podID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerID="05aaca92bf30378ec0da5789869bedf98ca75942d1da2d9ed4580df83fbd07a6" exitCode=0 Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.768778 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzgfj" event={"ID":"b2bb2e76-e094-4320-8bab-f54bf623dcb1","Type":"ContainerDied","Data":"05aaca92bf30378ec0da5789869bedf98ca75942d1da2d9ed4580df83fbd07a6"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.768836 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzgfj" event={"ID":"b2bb2e76-e094-4320-8bab-f54bf623dcb1","Type":"ContainerStarted","Data":"21a482f187a572bed61df4bb31b0f5238996fb7790b0f5875530d45fefea6315"} Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.841953 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.841936175 podStartE2EDuration="2.841936175s" podCreationTimestamp="2025-09-30 06:21:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:46.817579691 +0000 UTC m=+150.292600731" watchObservedRunningTime="2025-09-30 06:21:46.841936175 +0000 UTC m=+150.316957215" Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.890211 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router 
namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:46 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:46 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:46 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.890272 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:46 crc kubenswrapper[4691]: I0930 06:21:46.925777 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.044687 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9mzhd"] Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.045989 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.047566 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.055500 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mzhd"] Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.105363 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlxgb"] Sep 30 06:21:47 crc kubenswrapper[4691]: W0930 06:21:47.111282 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc134a0fe_e3a2_4683_95d1_045ba2056b14.slice/crio-4cb80b00ced64fb3f7ec90b38a3d3186ed66f695137b7bad6a0f749f9f6427f2 WatchSource:0}: Error finding container 4cb80b00ced64fb3f7ec90b38a3d3186ed66f695137b7bad6a0f749f9f6427f2: Status 404 returned error can't find the container with id 4cb80b00ced64fb3f7ec90b38a3d3186ed66f695137b7bad6a0f749f9f6427f2 Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.112880 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2nrv\" (UniqueName: \"kubernetes.io/projected/fecc9d18-f13e-4620-a9da-b620e9660ec7-kube-api-access-w2nrv\") pod \"redhat-marketplace-9mzhd\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.112967 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-catalog-content\") pod \"redhat-marketplace-9mzhd\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.113035 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-utilities\") pod \"redhat-marketplace-9mzhd\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " 
pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.217071 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-utilities\") pod \"redhat-marketplace-9mzhd\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.217145 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w2nrv\" (UniqueName: \"kubernetes.io/projected/fecc9d18-f13e-4620-a9da-b620e9660ec7-kube-api-access-w2nrv\") pod \"redhat-marketplace-9mzhd\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.217204 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-catalog-content\") pod \"redhat-marketplace-9mzhd\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.217546 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-utilities\") pod \"redhat-marketplace-9mzhd\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.217645 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-catalog-content\") pod \"redhat-marketplace-9mzhd\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.233843 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.236211 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2nrv\" (UniqueName: \"kubernetes.io/projected/fecc9d18-f13e-4620-a9da-b620e9660ec7-kube-api-access-w2nrv\") pod \"redhat-marketplace-9mzhd\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.362516 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.447277 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d6xvc"] Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.454371 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.498347 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6xvc"] Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.521152 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-catalog-content\") pod \"redhat-marketplace-d6xvc\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.521206 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-utilities\") pod \"redhat-marketplace-d6xvc\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.521236 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bkvc\" (UniqueName: \"kubernetes.io/projected/2fae00bc-923f-4e3c-979b-bfde482dc0b0-kube-api-access-7bkvc\") pod \"redhat-marketplace-d6xvc\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.585950 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mzhd"] Sep 30 06:21:47 crc kubenswrapper[4691]: W0930 06:21:47.613720 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfecc9d18_f13e_4620_a9da_b620e9660ec7.slice/crio-8ada3ce5c48083d682c4e94216fd9c0972769c07127ec4fb0a99090ce7056bcc WatchSource:0}: Error finding container 8ada3ce5c48083d682c4e94216fd9c0972769c07127ec4fb0a99090ce7056bcc: Status 404 returned error can't find the container with id 8ada3ce5c48083d682c4e94216fd9c0972769c07127ec4fb0a99090ce7056bcc Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.622315 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-catalog-content\") pod \"redhat-marketplace-d6xvc\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.622380 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-utilities\") pod \"redhat-marketplace-d6xvc\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.622411 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bkvc\" (UniqueName: \"kubernetes.io/projected/2fae00bc-923f-4e3c-979b-bfde482dc0b0-kube-api-access-7bkvc\") pod \"redhat-marketplace-d6xvc\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.623074 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-utilities\") pod \"redhat-marketplace-d6xvc\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.623141 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-catalog-content\") pod \"redhat-marketplace-d6xvc\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.653615 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bkvc\" (UniqueName: \"kubernetes.io/projected/2fae00bc-923f-4e3c-979b-bfde482dc0b0-kube-api-access-7bkvc\") pod \"redhat-marketplace-d6xvc\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.776577 4691 generic.go:334] "Generic (PLEG): container finished" podID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerID="1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71" exitCode=0 Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.777082 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8kfg" event={"ID":"c5278b1f-f696-4223-a657-24a3d309b5f3","Type":"ContainerDied","Data":"1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71"} Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.777115 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8kfg" event={"ID":"c5278b1f-f696-4223-a657-24a3d309b5f3","Type":"ContainerStarted","Data":"e671017a28b6ba7d11c15a2853100c952c2f7c9e1e6b7274fa1a29fd81116f93"} Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.779531 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" event={"ID":"c134a0fe-e3a2-4683-95d1-045ba2056b14","Type":"ContainerStarted","Data":"5a1a61bce140a4733d73582e090cece0039c90f5edb54b09d245529de9f3a1ee"} Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.779582 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" event={"ID":"c134a0fe-e3a2-4683-95d1-045ba2056b14","Type":"ContainerStarted","Data":"4cb80b00ced64fb3f7ec90b38a3d3186ed66f695137b7bad6a0f749f9f6427f2"} Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.779718 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.781455 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mzhd" event={"ID":"fecc9d18-f13e-4620-a9da-b620e9660ec7","Type":"ContainerStarted","Data":"a03ff0b5f01a0084d1d0d69d53b6baa68da298d7a2a55c1792be8d9338c58c48"} Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.781492 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mzhd" event={"ID":"fecc9d18-f13e-4620-a9da-b620e9660ec7","Type":"ContainerStarted","Data":"8ada3ce5c48083d682c4e94216fd9c0972769c07127ec4fb0a99090ce7056bcc"} Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.787056 4691 generic.go:334] "Generic (PLEG): container finished" 
podID="bc9f14f0-3298-44a3-bf88-833165f8a662" containerID="b03b49417b62fc63c40bbfdaf1b26387bcddc926de69ea6c0be53498928431c2" exitCode=0 Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.787118 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bc9f14f0-3298-44a3-bf88-833165f8a662","Type":"ContainerDied","Data":"b03b49417b62fc63c40bbfdaf1b26387bcddc926de69ea6c0be53498928431c2"} Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.790493 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerID="a567bb7108c3a33eb4e9d29a4a97e035ef323ab632d29711d02c18b2c482e30a" exitCode=0 Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.790555 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qln5x" event={"ID":"ca900afa-86c0-4fe0-ba3d-d5d927db24b7","Type":"ContainerDied","Data":"a567bb7108c3a33eb4e9d29a4a97e035ef323ab632d29711d02c18b2c482e30a"} Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.800937 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.843390 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" podStartSLOduration=129.84335222 podStartE2EDuration="2m9.84335222s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:47.841632595 +0000 UTC m=+151.316653655" watchObservedRunningTime="2025-09-30 06:21:47.84335222 +0000 UTC m=+151.318373260" Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.869505 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:47 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:47 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:47 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:47 crc kubenswrapper[4691]: I0930 06:21:47.869559 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.078987 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6xvc"] Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.087384 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:48 crc kubenswrapper[4691]: W0930 06:21:48.088497 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fae00bc_923f_4e3c_979b_bfde482dc0b0.slice/crio-147f7424158c390457b54223a55788eaca8cb94ac0c28fc9b0ca68df2cddf637 WatchSource:0}: Error finding container 147f7424158c390457b54223a55788eaca8cb94ac0c28fc9b0ca68df2cddf637: Status 404 returned error can't find the container with id 147f7424158c390457b54223a55788eaca8cb94ac0c28fc9b0ca68df2cddf637 Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.151110 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cwh9r\" (UniqueName: \"kubernetes.io/projected/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-kube-api-access-cwh9r\") pod \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.151525 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-secret-volume\") pod \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.151581 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-config-volume\") pod \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\" (UID: \"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8\") " Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.152229 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-config-volume" (OuterVolumeSpecName: "config-volume") pod "e2df0f77-43e6-417a-a2ef-dfee3e80cbd8" (UID: "e2df0f77-43e6-417a-a2ef-dfee3e80cbd8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.156154 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e2df0f77-43e6-417a-a2ef-dfee3e80cbd8" (UID: "e2df0f77-43e6-417a-a2ef-dfee3e80cbd8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.157075 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-kube-api-access-cwh9r" (OuterVolumeSpecName: "kube-api-access-cwh9r") pod "e2df0f77-43e6-417a-a2ef-dfee3e80cbd8" (UID: "e2df0f77-43e6-417a-a2ef-dfee3e80cbd8"). InnerVolumeSpecName "kube-api-access-cwh9r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.252588 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.252611 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.252620 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwh9r\" (UniqueName: \"kubernetes.io/projected/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8-kube-api-access-cwh9r\") on node \"crc\" DevicePath \"\"" Sep 30 06:21:48 crc kubenswrapper[4691]: E0930 06:21:48.311755 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2fae00bc_923f_4e3c_979b_bfde482dc0b0.slice/crio-conmon-60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016.scope\": RecentStats: unable to find data in memory cache]" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.447777 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ghx4r"] Sep 30 06:21:48 crc kubenswrapper[4691]: E0930 06:21:48.448032 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2df0f77-43e6-417a-a2ef-dfee3e80cbd8" containerName="collect-profiles" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.448047 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2df0f77-43e6-417a-a2ef-dfee3e80cbd8" containerName="collect-profiles" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.448152 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2df0f77-43e6-417a-a2ef-dfee3e80cbd8" containerName="collect-profiles" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.448853 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.451742 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.455940 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ghx4r"] Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.556352 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-utilities\") pod \"redhat-operators-ghx4r\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.556400 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-catalog-content\") pod \"redhat-operators-ghx4r\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.556446 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-znsl5\" (UniqueName: \"kubernetes.io/projected/c4fda877-ac4f-419b-9cf9-933c5bca0aba-kube-api-access-znsl5\") pod \"redhat-operators-ghx4r\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.657956 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-utilities\") pod \"redhat-operators-ghx4r\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.658009 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-catalog-content\") pod \"redhat-operators-ghx4r\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.658050 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-znsl5\" (UniqueName: \"kubernetes.io/projected/c4fda877-ac4f-419b-9cf9-933c5bca0aba-kube-api-access-znsl5\") pod \"redhat-operators-ghx4r\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.658486 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-utilities\") pod \"redhat-operators-ghx4r\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.658612 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-catalog-content\") pod \"redhat-operators-ghx4r\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " 
pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.674580 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-znsl5\" (UniqueName: \"kubernetes.io/projected/c4fda877-ac4f-419b-9cf9-933c5bca0aba-kube-api-access-znsl5\") pod \"redhat-operators-ghx4r\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.768654 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.804479 4691 generic.go:334] "Generic (PLEG): container finished" podID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerID="a03ff0b5f01a0084d1d0d69d53b6baa68da298d7a2a55c1792be8d9338c58c48" exitCode=0 Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.804586 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mzhd" event={"ID":"fecc9d18-f13e-4620-a9da-b620e9660ec7","Type":"ContainerDied","Data":"a03ff0b5f01a0084d1d0d69d53b6baa68da298d7a2a55c1792be8d9338c58c48"} Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.821922 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" event={"ID":"e2df0f77-43e6-417a-a2ef-dfee3e80cbd8","Type":"ContainerDied","Data":"182bdd9e95dffa9e8af6d34ae75a1122af8cdb776d23f0e45038ffab88d8eda6"} Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.821898 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.821987 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="182bdd9e95dffa9e8af6d34ae75a1122af8cdb776d23f0e45038ffab88d8eda6" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.825002 4691 generic.go:334] "Generic (PLEG): container finished" podID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerID="60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016" exitCode=0 Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.825149 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6xvc" event={"ID":"2fae00bc-923f-4e3c-979b-bfde482dc0b0","Type":"ContainerDied","Data":"60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016"} Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.825195 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6xvc" event={"ID":"2fae00bc-923f-4e3c-979b-bfde482dc0b0","Type":"ContainerStarted","Data":"147f7424158c390457b54223a55788eaca8cb94ac0c28fc9b0ca68df2cddf637"} Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.859979 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-b7txr"] Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.861780 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.865960 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.870621 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:48 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:48 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:48 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.870703 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.879657 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7txr"] Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.966672 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.971079 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5hlv9" Sep 30 06:21:48 crc kubenswrapper[4691]: I0930 06:21:48.986915 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.001411 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ghx4r"] Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.045921 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.065018 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slmhx\" (UniqueName: \"kubernetes.io/projected/95812fe8-f69f-4c5a-9a91-e3e054317b63-kube-api-access-slmhx\") pod \"redhat-operators-b7txr\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.065120 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-catalog-content\") pod \"redhat-operators-b7txr\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.065147 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-utilities\") pod \"redhat-operators-b7txr\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.068025 4691 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fthnz" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.166055 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-slmhx\" (UniqueName: \"kubernetes.io/projected/95812fe8-f69f-4c5a-9a91-e3e054317b63-kube-api-access-slmhx\") pod \"redhat-operators-b7txr\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.166314 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-catalog-content\") pod \"redhat-operators-b7txr\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.166405 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-utilities\") pod \"redhat-operators-b7txr\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.167273 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-utilities\") pod \"redhat-operators-b7txr\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.167627 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-catalog-content\") pod \"redhat-operators-b7txr\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.200046 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-slmhx\" (UniqueName: \"kubernetes.io/projected/95812fe8-f69f-4c5a-9a91-e3e054317b63-kube-api-access-slmhx\") pod \"redhat-operators-b7txr\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.200292 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.410805 4691 patch_prober.go:28] interesting pod/downloads-7954f5f757-nklgj container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.411109 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-nklgj" podUID="a779ae38-da0c-4953-8d61-6047076785d2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.410922 4691 patch_prober.go:28] interesting pod/downloads-7954f5f757-nklgj container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.411484 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-nklgj" podUID="a779ae38-da0c-4953-8d61-6047076785d2" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.20:8080/\": dial tcp 10.217.0.20:8080: connect: connection refused" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.412699 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.451670 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.451701 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.455553 4691 patch_prober.go:28] interesting pod/console-f9d7485db-thj2p container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.455592 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-thj2p" podUID="4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" containerName="console" probeResult="failure" output="Get \"https://10.217.0.16:8443/health\": dial tcp 10.217.0.16:8443: connect: connection refused" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.571559 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc9f14f0-3298-44a3-bf88-833165f8a662-kubelet-dir\") pod \"bc9f14f0-3298-44a3-bf88-833165f8a662\" (UID: \"bc9f14f0-3298-44a3-bf88-833165f8a662\") " Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.571691 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc9f14f0-3298-44a3-bf88-833165f8a662-kube-api-access\") pod \"bc9f14f0-3298-44a3-bf88-833165f8a662\" (UID: \"bc9f14f0-3298-44a3-bf88-833165f8a662\") " Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.571703 4691 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc9f14f0-3298-44a3-bf88-833165f8a662-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bc9f14f0-3298-44a3-bf88-833165f8a662" (UID: "bc9f14f0-3298-44a3-bf88-833165f8a662"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.571988 4691 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc9f14f0-3298-44a3-bf88-833165f8a662-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.578120 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9f14f0-3298-44a3-bf88-833165f8a662-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bc9f14f0-3298-44a3-bf88-833165f8a662" (UID: "bc9f14f0-3298-44a3-bf88-833165f8a662"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.674004 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc9f14f0-3298-44a3-bf88-833165f8a662-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.675907 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-dchbw" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.773059 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-b7txr"] Sep 30 06:21:49 crc kubenswrapper[4691]: W0930 06:21:49.791186 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95812fe8_f69f_4c5a_9a91_e3e054317b63.slice/crio-dcc3355cfc48edb397cc2a21a05252699f21f58176cef2ea9ad0312e7bf1ec83 WatchSource:0}: Error finding container dcc3355cfc48edb397cc2a21a05252699f21f58176cef2ea9ad0312e7bf1ec83: Status 404 returned error can't find the container with id dcc3355cfc48edb397cc2a21a05252699f21f58176cef2ea9ad0312e7bf1ec83 Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.835002 4691 generic.go:334] "Generic (PLEG): container finished" podID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerID="07b3ae1fda85d658f298e48b3517de80abf2de14675389df2e1ec1d7e416a391" exitCode=0 Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.835072 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghx4r" event={"ID":"c4fda877-ac4f-419b-9cf9-933c5bca0aba","Type":"ContainerDied","Data":"07b3ae1fda85d658f298e48b3517de80abf2de14675389df2e1ec1d7e416a391"} Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.835105 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghx4r" event={"ID":"c4fda877-ac4f-419b-9cf9-933c5bca0aba","Type":"ContainerStarted","Data":"8e02838421031fce338c84df67c95e4c0b24c902e0f14c3bd6156639abb4f705"} Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.840060 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"bc9f14f0-3298-44a3-bf88-833165f8a662","Type":"ContainerDied","Data":"08e40c4a256c2d92fdbaaa7fc5714acfdb1b97628b713b9626463986e6c2547d"} Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.840307 4691 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="08e40c4a256c2d92fdbaaa7fc5714acfdb1b97628b713b9626463986e6c2547d" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.840444 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.847104 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7txr" event={"ID":"95812fe8-f69f-4c5a-9a91-e3e054317b63","Type":"ContainerStarted","Data":"dcc3355cfc48edb397cc2a21a05252699f21f58176cef2ea9ad0312e7bf1ec83"} Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.869549 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:49 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:49 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:49 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.869588 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:49 crc kubenswrapper[4691]: I0930 06:21:49.887112 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.707786 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 06:21:50 crc kubenswrapper[4691]: E0930 06:21:50.708013 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9f14f0-3298-44a3-bf88-833165f8a662" containerName="pruner" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.708025 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9f14f0-3298-44a3-bf88-833165f8a662" containerName="pruner" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.708113 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9f14f0-3298-44a3-bf88-833165f8a662" containerName="pruner" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.708440 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.711141 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.712359 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.716091 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.801176 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd902b5e-df8a-47be-9781-8a384f3849fb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"dd902b5e-df8a-47be-9781-8a384f3849fb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.801248 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd902b5e-df8a-47be-9781-8a384f3849fb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"dd902b5e-df8a-47be-9781-8a384f3849fb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.868138 4691 generic.go:334] "Generic (PLEG): container finished" podID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerID="6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a" exitCode=0 Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.868176 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7txr" event={"ID":"95812fe8-f69f-4c5a-9a91-e3e054317b63","Type":"ContainerDied","Data":"6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a"} Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.868999 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:50 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:50 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:50 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.869020 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.902498 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd902b5e-df8a-47be-9781-8a384f3849fb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"dd902b5e-df8a-47be-9781-8a384f3849fb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.902604 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd902b5e-df8a-47be-9781-8a384f3849fb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: 
\"dd902b5e-df8a-47be-9781-8a384f3849fb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.902673 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd902b5e-df8a-47be-9781-8a384f3849fb-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"dd902b5e-df8a-47be-9781-8a384f3849fb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 06:21:50 crc kubenswrapper[4691]: I0930 06:21:50.924996 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd902b5e-df8a-47be-9781-8a384f3849fb-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"dd902b5e-df8a-47be-9781-8a384f3849fb\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 06:21:51 crc kubenswrapper[4691]: I0930 06:21:51.044307 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 06:21:51 crc kubenswrapper[4691]: I0930 06:21:51.688753 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-stxbl" Sep 30 06:21:51 crc kubenswrapper[4691]: I0930 06:21:51.736557 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Sep 30 06:21:51 crc kubenswrapper[4691]: W0930 06:21:51.797818 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddd902b5e_df8a_47be_9781_8a384f3849fb.slice/crio-4f6edd1717a969236b5aee56255cdc38f5dc928b19d34207e78f89597c1d5c13 WatchSource:0}: Error finding container 4f6edd1717a969236b5aee56255cdc38f5dc928b19d34207e78f89597c1d5c13: Status 404 returned error can't find the container with id 4f6edd1717a969236b5aee56255cdc38f5dc928b19d34207e78f89597c1d5c13 Sep 30 06:21:51 crc kubenswrapper[4691]: I0930 06:21:51.873540 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:51 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:51 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:51 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:51 crc kubenswrapper[4691]: I0930 06:21:51.873602 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:51 crc kubenswrapper[4691]: I0930 06:21:51.884609 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dd902b5e-df8a-47be-9781-8a384f3849fb","Type":"ContainerStarted","Data":"4f6edd1717a969236b5aee56255cdc38f5dc928b19d34207e78f89597c1d5c13"} Sep 30 06:21:52 crc kubenswrapper[4691]: I0930 06:21:52.850471 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:21:52 crc kubenswrapper[4691]: I0930 06:21:52.850751 4691 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:21:52 crc kubenswrapper[4691]: I0930 06:21:52.869257 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:52 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:52 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:52 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:52 crc kubenswrapper[4691]: I0930 06:21:52.869326 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:52 crc kubenswrapper[4691]: I0930 06:21:52.915285 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dd902b5e-df8a-47be-9781-8a384f3849fb","Type":"ContainerStarted","Data":"fd374ea7798591b523f429687f6470969ca78117dbf311e762aa4afb6d833398"} Sep 30 06:21:52 crc kubenswrapper[4691]: I0930 06:21:52.933608 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.933580448 podStartE2EDuration="2.933580448s" podCreationTimestamp="2025-09-30 06:21:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:21:52.928559978 +0000 UTC m=+156.403581018" watchObservedRunningTime="2025-09-30 06:21:52.933580448 +0000 UTC m=+156.408601508" Sep 30 06:21:53 crc kubenswrapper[4691]: I0930 06:21:53.869796 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:53 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:53 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:53 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:53 crc kubenswrapper[4691]: I0930 06:21:53.870103 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:53 crc kubenswrapper[4691]: I0930 06:21:53.931262 4691 generic.go:334] "Generic (PLEG): container finished" podID="dd902b5e-df8a-47be-9781-8a384f3849fb" containerID="fd374ea7798591b523f429687f6470969ca78117dbf311e762aa4afb6d833398" exitCode=0 Sep 30 06:21:53 crc kubenswrapper[4691]: I0930 06:21:53.931309 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dd902b5e-df8a-47be-9781-8a384f3849fb","Type":"ContainerDied","Data":"fd374ea7798591b523f429687f6470969ca78117dbf311e762aa4afb6d833398"} Sep 30 06:21:54 crc kubenswrapper[4691]: I0930 06:21:54.868522 4691 patch_prober.go:28] interesting pod/router-default-5444994796-7frz7 
container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Sep 30 06:21:54 crc kubenswrapper[4691]: [-]has-synced failed: reason withheld Sep 30 06:21:54 crc kubenswrapper[4691]: [+]process-running ok Sep 30 06:21:54 crc kubenswrapper[4691]: healthz check failed Sep 30 06:21:54 crc kubenswrapper[4691]: I0930 06:21:54.868575 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7frz7" podUID="d6587adc-a984-4ce3-af8d-6739325c8604" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:21:55 crc kubenswrapper[4691]: I0930 06:21:55.889229 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:55 crc kubenswrapper[4691]: I0930 06:21:55.893262 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-7frz7" Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.148398 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.221306 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd902b5e-df8a-47be-9781-8a384f3849fb-kube-api-access\") pod \"dd902b5e-df8a-47be-9781-8a384f3849fb\" (UID: \"dd902b5e-df8a-47be-9781-8a384f3849fb\") " Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.221366 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd902b5e-df8a-47be-9781-8a384f3849fb-kubelet-dir\") pod \"dd902b5e-df8a-47be-9781-8a384f3849fb\" (UID: \"dd902b5e-df8a-47be-9781-8a384f3849fb\") " Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.221720 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd902b5e-df8a-47be-9781-8a384f3849fb-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "dd902b5e-df8a-47be-9781-8a384f3849fb" (UID: "dd902b5e-df8a-47be-9781-8a384f3849fb"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.230981 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd902b5e-df8a-47be-9781-8a384f3849fb-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "dd902b5e-df8a-47be-9781-8a384f3849fb" (UID: "dd902b5e-df8a-47be-9781-8a384f3849fb"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.322671 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/dd902b5e-df8a-47be-9781-8a384f3849fb-kube-api-access\") on node \"crc\" DevicePath \"\"" Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.322718 4691 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/dd902b5e-df8a-47be-9781-8a384f3849fb-kubelet-dir\") on node \"crc\" DevicePath \"\"" Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.417412 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-nklgj" Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.463417 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.469990 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.993517 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"dd902b5e-df8a-47be-9781-8a384f3849fb","Type":"ContainerDied","Data":"4f6edd1717a969236b5aee56255cdc38f5dc928b19d34207e78f89597c1d5c13"} Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.993563 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Sep 30 06:21:59 crc kubenswrapper[4691]: I0930 06:21:59.993587 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f6edd1717a969236b5aee56255cdc38f5dc928b19d34207e78f89597c1d5c13" Sep 30 06:22:01 crc kubenswrapper[4691]: I0930 06:22:01.255118 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:22:01 crc kubenswrapper[4691]: I0930 06:22:01.271033 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a8ed6f92-0b98-4b1b-a46e-4d0604d686a1-metrics-certs\") pod \"network-metrics-daemon-svjxq\" (UID: \"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1\") " pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:22:01 crc kubenswrapper[4691]: I0930 06:22:01.451517 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-svjxq" Sep 30 06:22:06 crc kubenswrapper[4691]: I0930 06:22:06.931607 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:22:12 crc kubenswrapper[4691]: I0930 06:22:12.351580 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-svjxq"] Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.068713 4691 generic.go:334] "Generic (PLEG): container finished" podID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerID="b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825" exitCode=0 Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.069138 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxlzp" event={"ID":"07c0a985-95a5-4fd3-a271-9e46d5f51af4","Type":"ContainerDied","Data":"b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825"} Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.072673 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mzhd" event={"ID":"fecc9d18-f13e-4620-a9da-b620e9660ec7","Type":"ContainerDied","Data":"d31262f193b2386c4a7f7d68a053a74f942daa91b8f439e7cc9983492cf517d5"} Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.072496 4691 generic.go:334] "Generic (PLEG): container finished" podID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerID="d31262f193b2386c4a7f7d68a053a74f942daa91b8f439e7cc9983492cf517d5" exitCode=0 Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.081675 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7txr" event={"ID":"95812fe8-f69f-4c5a-9a91-e3e054317b63","Type":"ContainerStarted","Data":"e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623"} Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.096242 4691 generic.go:334] "Generic (PLEG): container finished" podID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerID="325e589186d33ce364d6baa98d882e0d1befa51bbd84c16cbea1c0a74fda1a10" exitCode=0 Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.096525 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghx4r" event={"ID":"c4fda877-ac4f-419b-9cf9-933c5bca0aba","Type":"ContainerDied","Data":"325e589186d33ce364d6baa98d882e0d1befa51bbd84c16cbea1c0a74fda1a10"} Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.106531 4691 generic.go:334] "Generic (PLEG): container finished" podID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerID="846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f" exitCode=0 Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.106585 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8kfg" event={"ID":"c5278b1f-f696-4223-a657-24a3d309b5f3","Type":"ContainerDied","Data":"846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f"} Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.109680 4691 generic.go:334] "Generic (PLEG): container finished" podID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerID="ae55550324a97f4074df6df0f895bc9770003d48668e79fd555e95318262b96c" exitCode=0 Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.109808 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzgfj" 
event={"ID":"b2bb2e76-e094-4320-8bab-f54bf623dcb1","Type":"ContainerDied","Data":"ae55550324a97f4074df6df0f895bc9770003d48668e79fd555e95318262b96c"} Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.114152 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-svjxq" event={"ID":"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1","Type":"ContainerStarted","Data":"fc5f95d6803f6bd78067787bec80611ff5fd263236b2a9ae55f7570bde73076f"} Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.114187 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-svjxq" event={"ID":"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1","Type":"ContainerStarted","Data":"10a6d5c92c832af02efceac112ad4f5f7451bddb86839ed58a28b4811a1891ab"} Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.119053 4691 generic.go:334] "Generic (PLEG): container finished" podID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerID="2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82" exitCode=0 Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.119106 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6xvc" event={"ID":"2fae00bc-923f-4e3c-979b-bfde482dc0b0","Type":"ContainerDied","Data":"2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82"} Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.122122 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerID="f36f6ea592a163cc14363c8c7410a9dabebab2bb187c2d37f01d94aff5163cda" exitCode=0 Sep 30 06:22:13 crc kubenswrapper[4691]: I0930 06:22:13.122197 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qln5x" event={"ID":"ca900afa-86c0-4fe0-ba3d-d5d927db24b7","Type":"ContainerDied","Data":"f36f6ea592a163cc14363c8c7410a9dabebab2bb187c2d37f01d94aff5163cda"} Sep 30 06:22:14 crc kubenswrapper[4691]: I0930 06:22:14.133123 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-svjxq" event={"ID":"a8ed6f92-0b98-4b1b-a46e-4d0604d686a1","Type":"ContainerStarted","Data":"ca4a03d8108f88ef29c498c8cc6d1e0ed27308925fa682a3ffc557f8b99fc233"} Sep 30 06:22:14 crc kubenswrapper[4691]: I0930 06:22:14.138098 4691 generic.go:334] "Generic (PLEG): container finished" podID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerID="e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623" exitCode=0 Sep 30 06:22:14 crc kubenswrapper[4691]: I0930 06:22:14.138338 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7txr" event={"ID":"95812fe8-f69f-4c5a-9a91-e3e054317b63","Type":"ContainerDied","Data":"e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623"} Sep 30 06:22:14 crc kubenswrapper[4691]: I0930 06:22:14.164747 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-svjxq" podStartSLOduration=156.164713033 podStartE2EDuration="2m36.164713033s" podCreationTimestamp="2025-09-30 06:19:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:22:14.155172559 +0000 UTC m=+177.630193659" watchObservedRunningTime="2025-09-30 06:22:14.164713033 +0000 UTC m=+177.639734153" Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.146865 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-9mzhd" event={"ID":"fecc9d18-f13e-4620-a9da-b620e9660ec7","Type":"ContainerStarted","Data":"e8c6fa72db0c9e7d81b84402cc9c28a73de65b554ae0c1a357a6a5d81dadae3f"} Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.150293 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qln5x" event={"ID":"ca900afa-86c0-4fe0-ba3d-d5d927db24b7","Type":"ContainerStarted","Data":"bc8f5b68997e3a0b344d66030cdc6f2de9602f301ebb44c0dad2b4a164c943c6"} Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.152645 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghx4r" event={"ID":"c4fda877-ac4f-419b-9cf9-933c5bca0aba","Type":"ContainerStarted","Data":"be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0"} Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.156125 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzgfj" event={"ID":"b2bb2e76-e094-4320-8bab-f54bf623dcb1","Type":"ContainerStarted","Data":"425e7dd7c338e2cf530893d676a01438c6b86ddd8fbb35ebc43df3f02c5e88da"} Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.165055 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9mzhd" podStartSLOduration=2.215009866 podStartE2EDuration="28.165034604s" podCreationTimestamp="2025-09-30 06:21:47 +0000 UTC" firstStartedPulling="2025-09-30 06:21:48.823404357 +0000 UTC m=+152.298425397" lastFinishedPulling="2025-09-30 06:22:14.773429055 +0000 UTC m=+178.248450135" observedRunningTime="2025-09-30 06:22:15.164521277 +0000 UTC m=+178.639542347" watchObservedRunningTime="2025-09-30 06:22:15.165034604 +0000 UTC m=+178.640055664" Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.201341 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ghx4r" podStartSLOduration=2.133576018 podStartE2EDuration="27.201318786s" podCreationTimestamp="2025-09-30 06:21:48 +0000 UTC" firstStartedPulling="2025-09-30 06:21:49.837251597 +0000 UTC m=+153.312272637" lastFinishedPulling="2025-09-30 06:22:14.904994365 +0000 UTC m=+178.380015405" observedRunningTime="2025-09-30 06:22:15.190952006 +0000 UTC m=+178.665973066" watchObservedRunningTime="2025-09-30 06:22:15.201318786 +0000 UTC m=+178.676339826" Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.217145 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-qln5x" podStartSLOduration=2.159609775 podStartE2EDuration="30.217122068s" podCreationTimestamp="2025-09-30 06:21:45 +0000 UTC" firstStartedPulling="2025-09-30 06:21:46.75675627 +0000 UTC m=+150.231777310" lastFinishedPulling="2025-09-30 06:22:14.814268523 +0000 UTC m=+178.289289603" observedRunningTime="2025-09-30 06:22:15.210622771 +0000 UTC m=+178.685643861" watchObservedRunningTime="2025-09-30 06:22:15.217122068 +0000 UTC m=+178.692143118" Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.236566 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vzgfj" podStartSLOduration=2.142558233 podStartE2EDuration="30.236546414s" podCreationTimestamp="2025-09-30 06:21:45 +0000 UTC" firstStartedPulling="2025-09-30 06:21:46.769796103 +0000 UTC m=+150.244817144" lastFinishedPulling="2025-09-30 06:22:14.863784285 +0000 UTC m=+178.338805325" 
observedRunningTime="2025-09-30 06:22:15.231911838 +0000 UTC m=+178.706932928" watchObservedRunningTime="2025-09-30 06:22:15.236546414 +0000 UTC m=+178.711567464" Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.414625 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.414678 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.635629 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:22:15 crc kubenswrapper[4691]: I0930 06:22:15.635674 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:22:16 crc kubenswrapper[4691]: I0930 06:22:16.162773 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8kfg" event={"ID":"c5278b1f-f696-4223-a657-24a3d309b5f3","Type":"ContainerStarted","Data":"53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02"} Sep 30 06:22:16 crc kubenswrapper[4691]: I0930 06:22:16.165165 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxlzp" event={"ID":"07c0a985-95a5-4fd3-a271-9e46d5f51af4","Type":"ContainerStarted","Data":"00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522"} Sep 30 06:22:16 crc kubenswrapper[4691]: I0930 06:22:16.167674 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7txr" event={"ID":"95812fe8-f69f-4c5a-9a91-e3e054317b63","Type":"ContainerStarted","Data":"d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c"} Sep 30 06:22:16 crc kubenswrapper[4691]: I0930 06:22:16.170149 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6xvc" event={"ID":"2fae00bc-923f-4e3c-979b-bfde482dc0b0","Type":"ContainerStarted","Data":"abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243"} Sep 30 06:22:16 crc kubenswrapper[4691]: I0930 06:22:16.200202 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-x8kfg" podStartSLOduration=3.854166275 podStartE2EDuration="31.20017946s" podCreationTimestamp="2025-09-30 06:21:45 +0000 UTC" firstStartedPulling="2025-09-30 06:21:47.788275622 +0000 UTC m=+151.263296662" lastFinishedPulling="2025-09-30 06:22:15.134288797 +0000 UTC m=+178.609309847" observedRunningTime="2025-09-30 06:22:16.18601349 +0000 UTC m=+179.661034540" watchObservedRunningTime="2025-09-30 06:22:16.20017946 +0000 UTC m=+179.675200500" Sep 30 06:22:16 crc kubenswrapper[4691]: I0930 06:22:16.202240 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qxlzp" podStartSLOduration=2.952890099 podStartE2EDuration="31.202231095s" podCreationTimestamp="2025-09-30 06:21:45 +0000 UTC" firstStartedPulling="2025-09-30 06:21:46.766823469 +0000 UTC m=+150.241844509" lastFinishedPulling="2025-09-30 06:22:15.016164455 +0000 UTC m=+178.491185505" observedRunningTime="2025-09-30 06:22:16.199156737 +0000 UTC m=+179.674177777" watchObservedRunningTime="2025-09-30 06:22:16.202231095 +0000 UTC m=+179.677252145" Sep 30 06:22:16 crc kubenswrapper[4691]: I0930 06:22:16.217235 4691 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-b7txr" podStartSLOduration=3.887824823 podStartE2EDuration="28.217221441s" podCreationTimestamp="2025-09-30 06:21:48 +0000 UTC" firstStartedPulling="2025-09-30 06:21:50.871200775 +0000 UTC m=+154.346221815" lastFinishedPulling="2025-09-30 06:22:15.200597393 +0000 UTC m=+178.675618433" observedRunningTime="2025-09-30 06:22:16.21403318 +0000 UTC m=+179.689054220" watchObservedRunningTime="2025-09-30 06:22:16.217221441 +0000 UTC m=+179.692242481" Sep 30 06:22:16 crc kubenswrapper[4691]: I0930 06:22:16.231466 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d6xvc" podStartSLOduration=3.031147376 podStartE2EDuration="29.231453983s" podCreationTimestamp="2025-09-30 06:21:47 +0000 UTC" firstStartedPulling="2025-09-30 06:21:48.826595809 +0000 UTC m=+152.301616849" lastFinishedPulling="2025-09-30 06:22:15.026902406 +0000 UTC m=+178.501923456" observedRunningTime="2025-09-30 06:22:16.228658425 +0000 UTC m=+179.703679465" watchObservedRunningTime="2025-09-30 06:22:16.231453983 +0000 UTC m=+179.706475023" Sep 30 06:22:16 crc kubenswrapper[4691]: I0930 06:22:16.603946 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-vzgfj" podUID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerName="registry-server" probeResult="failure" output=< Sep 30 06:22:16 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Sep 30 06:22:16 crc kubenswrapper[4691]: > Sep 30 06:22:16 crc kubenswrapper[4691]: I0930 06:22:16.675788 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-qln5x" podUID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerName="registry-server" probeResult="failure" output=< Sep 30 06:22:16 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Sep 30 06:22:16 crc kubenswrapper[4691]: > Sep 30 06:22:17 crc kubenswrapper[4691]: I0930 06:22:17.362660 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:22:17 crc kubenswrapper[4691]: I0930 06:22:17.362983 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:22:17 crc kubenswrapper[4691]: I0930 06:22:17.428691 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:22:17 crc kubenswrapper[4691]: I0930 06:22:17.801313 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:22:17 crc kubenswrapper[4691]: I0930 06:22:17.801386 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:22:17 crc kubenswrapper[4691]: I0930 06:22:17.846650 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:22:18 crc kubenswrapper[4691]: I0930 06:22:18.768947 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:22:18 crc kubenswrapper[4691]: I0930 06:22:18.768996 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:22:19 
crc kubenswrapper[4691]: I0930 06:22:19.201928 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:22:19 crc kubenswrapper[4691]: I0930 06:22:19.202334 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:22:19 crc kubenswrapper[4691]: I0930 06:22:19.685376 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-s2xqr" Sep 30 06:22:19 crc kubenswrapper[4691]: I0930 06:22:19.839995 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ghx4r" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerName="registry-server" probeResult="failure" output=< Sep 30 06:22:19 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Sep 30 06:22:19 crc kubenswrapper[4691]: > Sep 30 06:22:20 crc kubenswrapper[4691]: I0930 06:22:20.261544 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-b7txr" podUID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerName="registry-server" probeResult="failure" output=< Sep 30 06:22:20 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Sep 30 06:22:20 crc kubenswrapper[4691]: > Sep 30 06:22:22 crc kubenswrapper[4691]: I0930 06:22:22.850054 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:22:22 crc kubenswrapper[4691]: I0930 06:22:22.850417 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:22:25 crc kubenswrapper[4691]: I0930 06:22:25.283425 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Sep 30 06:22:25 crc kubenswrapper[4691]: I0930 06:22:25.475775 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:22:25 crc kubenswrapper[4691]: I0930 06:22:25.534463 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:22:25 crc kubenswrapper[4691]: I0930 06:22:25.694862 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:22:25 crc kubenswrapper[4691]: I0930 06:22:25.736707 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:22:25 crc kubenswrapper[4691]: I0930 06:22:25.768140 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:22:25 crc kubenswrapper[4691]: I0930 06:22:25.768197 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:22:25 crc kubenswrapper[4691]: I0930 
06:22:25.807521 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:22:26 crc kubenswrapper[4691]: I0930 06:22:26.121148 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:22:26 crc kubenswrapper[4691]: I0930 06:22:26.121197 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:22:26 crc kubenswrapper[4691]: I0930 06:22:26.176721 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:22:26 crc kubenswrapper[4691]: I0930 06:22:26.292548 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:22:26 crc kubenswrapper[4691]: I0930 06:22:26.303728 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:22:27 crc kubenswrapper[4691]: I0930 06:22:27.435082 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:22:27 crc kubenswrapper[4691]: I0930 06:22:27.510616 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x8kfg"] Sep 30 06:22:27 crc kubenswrapper[4691]: I0930 06:22:27.845249 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.111966 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qxlzp"] Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.264833 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-x8kfg" podUID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerName="registry-server" containerID="cri-o://53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02" gracePeriod=2 Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.264952 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qxlzp" podUID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerName="registry-server" containerID="cri-o://00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522" gracePeriod=2 Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.800493 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.802564 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.816385 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.843949 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g958m\" (UniqueName: \"kubernetes.io/projected/07c0a985-95a5-4fd3-a271-9e46d5f51af4-kube-api-access-g958m\") pod \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.844013 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-utilities\") pod \"c5278b1f-f696-4223-a657-24a3d309b5f3\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.844059 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-catalog-content\") pod \"c5278b1f-f696-4223-a657-24a3d309b5f3\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.844099 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-utilities\") pod \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.844171 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-catalog-content\") pod \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\" (UID: \"07c0a985-95a5-4fd3-a271-9e46d5f51af4\") " Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.844212 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f92b\" (UniqueName: \"kubernetes.io/projected/c5278b1f-f696-4223-a657-24a3d309b5f3-kube-api-access-4f92b\") pod \"c5278b1f-f696-4223-a657-24a3d309b5f3\" (UID: \"c5278b1f-f696-4223-a657-24a3d309b5f3\") " Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.845132 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-utilities" (OuterVolumeSpecName: "utilities") pod "07c0a985-95a5-4fd3-a271-9e46d5f51af4" (UID: "07c0a985-95a5-4fd3-a271-9e46d5f51af4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.845295 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-utilities" (OuterVolumeSpecName: "utilities") pod "c5278b1f-f696-4223-a657-24a3d309b5f3" (UID: "c5278b1f-f696-4223-a657-24a3d309b5f3"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.858397 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07c0a985-95a5-4fd3-a271-9e46d5f51af4-kube-api-access-g958m" (OuterVolumeSpecName: "kube-api-access-g958m") pod "07c0a985-95a5-4fd3-a271-9e46d5f51af4" (UID: "07c0a985-95a5-4fd3-a271-9e46d5f51af4"). InnerVolumeSpecName "kube-api-access-g958m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.859097 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5278b1f-f696-4223-a657-24a3d309b5f3-kube-api-access-4f92b" (OuterVolumeSpecName: "kube-api-access-4f92b") pod "c5278b1f-f696-4223-a657-24a3d309b5f3" (UID: "c5278b1f-f696-4223-a657-24a3d309b5f3"). InnerVolumeSpecName "kube-api-access-4f92b". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.859138 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.896868 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07c0a985-95a5-4fd3-a271-9e46d5f51af4" (UID: "07c0a985-95a5-4fd3-a271-9e46d5f51af4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.900011 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c5278b1f-f696-4223-a657-24a3d309b5f3" (UID: "c5278b1f-f696-4223-a657-24a3d309b5f3"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.946099 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.946166 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4f92b\" (UniqueName: \"kubernetes.io/projected/c5278b1f-f696-4223-a657-24a3d309b5f3-kube-api-access-4f92b\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.946197 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g958m\" (UniqueName: \"kubernetes.io/projected/07c0a985-95a5-4fd3-a271-9e46d5f51af4-kube-api-access-g958m\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.946208 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.946217 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c5278b1f-f696-4223-a657-24a3d309b5f3-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:28 crc kubenswrapper[4691]: I0930 06:22:28.946225 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07c0a985-95a5-4fd3-a271-9e46d5f51af4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.274566 4691 generic.go:334] "Generic (PLEG): container finished" podID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerID="00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522" exitCode=0 Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.274602 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qxlzp" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.274620 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxlzp" event={"ID":"07c0a985-95a5-4fd3-a271-9e46d5f51af4","Type":"ContainerDied","Data":"00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522"} Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.275022 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qxlzp" event={"ID":"07c0a985-95a5-4fd3-a271-9e46d5f51af4","Type":"ContainerDied","Data":"d124bab712a6d32378971c33248edfa1ff3bb48979fc5c6a6399963b7c20c04b"} Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.275045 4691 scope.go:117] "RemoveContainer" containerID="00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.283430 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.299394 4691 generic.go:334] "Generic (PLEG): container finished" podID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerID="53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02" exitCode=0 Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.300122 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-x8kfg" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.300466 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8kfg" event={"ID":"c5278b1f-f696-4223-a657-24a3d309b5f3","Type":"ContainerDied","Data":"53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02"} Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.300488 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-x8kfg" event={"ID":"c5278b1f-f696-4223-a657-24a3d309b5f3","Type":"ContainerDied","Data":"e671017a28b6ba7d11c15a2853100c952c2f7c9e1e6b7274fa1a29fd81116f93"} Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.311964 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qxlzp"] Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.317189 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qxlzp"] Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.318318 4691 scope.go:117] "RemoveContainer" containerID="b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.334020 4691 scope.go:117] "RemoveContainer" containerID="1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.341861 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.373094 4691 scope.go:117] "RemoveContainer" containerID="00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.377817 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-x8kfg"] Sep 30 06:22:29 crc kubenswrapper[4691]: E0930 06:22:29.379322 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522\": container with ID starting with 00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522 not found: ID does not exist" containerID="00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.379364 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522"} err="failed to get container status \"00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522\": rpc error: code = NotFound desc = could not find container \"00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522\": container with ID starting with 00af7c51e1eba2c0685818e4b801f0cb159667916eae4ab88c892fba68316522 not found: ID does not exist" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.379407 4691 scope.go:117] "RemoveContainer" containerID="b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825" Sep 30 06:22:29 crc kubenswrapper[4691]: E0930 06:22:29.381817 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825\": container with ID starting with 
b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825 not found: ID does not exist" containerID="b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.381843 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825"} err="failed to get container status \"b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825\": rpc error: code = NotFound desc = could not find container \"b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825\": container with ID starting with b62091d059e502a69402978485949ed26afdd47b3721485945a887f247e81825 not found: ID does not exist" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.381859 4691 scope.go:117] "RemoveContainer" containerID="1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160" Sep 30 06:22:29 crc kubenswrapper[4691]: E0930 06:22:29.382939 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160\": container with ID starting with 1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160 not found: ID does not exist" containerID="1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.382969 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160"} err="failed to get container status \"1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160\": rpc error: code = NotFound desc = could not find container \"1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160\": container with ID starting with 1aeb90c4b067b706afc6c615c7342a76d8a1016bdaf50cb5a378664149348160 not found: ID does not exist" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.382993 4691 scope.go:117] "RemoveContainer" containerID="53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.385872 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-x8kfg"] Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.412144 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8bfj"] Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.429034 4691 scope.go:117] "RemoveContainer" containerID="846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.476920 4691 scope.go:117] "RemoveContainer" containerID="1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.496413 4691 scope.go:117] "RemoveContainer" containerID="53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02" Sep 30 06:22:29 crc kubenswrapper[4691]: E0930 06:22:29.498390 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02\": container with ID starting with 53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02 not found: ID does not exist" containerID="53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02" Sep 30 
06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.498427 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02"} err="failed to get container status \"53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02\": rpc error: code = NotFound desc = could not find container \"53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02\": container with ID starting with 53ef4a47c9fdac56b0624ca0a487e8d588dcef13b2ef66541ad012450d8a2e02 not found: ID does not exist" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.498447 4691 scope.go:117] "RemoveContainer" containerID="846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f" Sep 30 06:22:29 crc kubenswrapper[4691]: E0930 06:22:29.498657 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f\": container with ID starting with 846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f not found: ID does not exist" containerID="846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.498679 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f"} err="failed to get container status \"846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f\": rpc error: code = NotFound desc = could not find container \"846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f\": container with ID starting with 846841d356bb16c338536c062e0bc94f540fde4d40677855486831c29925e33f not found: ID does not exist" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.498710 4691 scope.go:117] "RemoveContainer" containerID="1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71" Sep 30 06:22:29 crc kubenswrapper[4691]: E0930 06:22:29.498984 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71\": container with ID starting with 1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71 not found: ID does not exist" containerID="1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.499003 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71"} err="failed to get container status \"1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71\": rpc error: code = NotFound desc = could not find container \"1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71\": container with ID starting with 1f64e57d6e5cba91d57e5cb7a09f1ba7477501f867b80146fdf6617fe87b2c71 not found: ID does not exist" Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.909436 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6xvc"] Sep 30 06:22:29 crc kubenswrapper[4691]: I0930 06:22:29.910739 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d6xvc" podUID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerName="registry-server" 
containerID="cri-o://abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243" gracePeriod=2 Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.251079 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.311803 4691 generic.go:334] "Generic (PLEG): container finished" podID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerID="abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243" exitCode=0 Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.311879 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6xvc" event={"ID":"2fae00bc-923f-4e3c-979b-bfde482dc0b0","Type":"ContainerDied","Data":"abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243"} Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.311930 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6xvc" event={"ID":"2fae00bc-923f-4e3c-979b-bfde482dc0b0","Type":"ContainerDied","Data":"147f7424158c390457b54223a55788eaca8cb94ac0c28fc9b0ca68df2cddf637"} Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.311953 4691 scope.go:117] "RemoveContainer" containerID="abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.312061 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6xvc" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.335342 4691 scope.go:117] "RemoveContainer" containerID="2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.356106 4691 scope.go:117] "RemoveContainer" containerID="60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.362941 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bkvc\" (UniqueName: \"kubernetes.io/projected/2fae00bc-923f-4e3c-979b-bfde482dc0b0-kube-api-access-7bkvc\") pod \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.363021 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-catalog-content\") pod \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.363089 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-utilities\") pod \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\" (UID: \"2fae00bc-923f-4e3c-979b-bfde482dc0b0\") " Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.365210 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-utilities" (OuterVolumeSpecName: "utilities") pod "2fae00bc-923f-4e3c-979b-bfde482dc0b0" (UID: "2fae00bc-923f-4e3c-979b-bfde482dc0b0"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.370773 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2fae00bc-923f-4e3c-979b-bfde482dc0b0-kube-api-access-7bkvc" (OuterVolumeSpecName: "kube-api-access-7bkvc") pod "2fae00bc-923f-4e3c-979b-bfde482dc0b0" (UID: "2fae00bc-923f-4e3c-979b-bfde482dc0b0"). InnerVolumeSpecName "kube-api-access-7bkvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.376321 4691 scope.go:117] "RemoveContainer" containerID="abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243" Sep 30 06:22:30 crc kubenswrapper[4691]: E0930 06:22:30.376657 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243\": container with ID starting with abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243 not found: ID does not exist" containerID="abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.376700 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243"} err="failed to get container status \"abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243\": rpc error: code = NotFound desc = could not find container \"abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243\": container with ID starting with abb9224e85b515cadbd0167960598a6b0d482908b0040f94f335e6e5495c6243 not found: ID does not exist" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.376728 4691 scope.go:117] "RemoveContainer" containerID="2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.377044 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2fae00bc-923f-4e3c-979b-bfde482dc0b0" (UID: "2fae00bc-923f-4e3c-979b-bfde482dc0b0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:22:30 crc kubenswrapper[4691]: E0930 06:22:30.377286 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82\": container with ID starting with 2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82 not found: ID does not exist" containerID="2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.377336 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82"} err="failed to get container status \"2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82\": rpc error: code = NotFound desc = could not find container \"2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82\": container with ID starting with 2f2bd095f86567c7528de9f72f3d236e4a63f3be42d89201d96f80ea0cd07e82 not found: ID does not exist" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.377374 4691 scope.go:117] "RemoveContainer" containerID="60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016" Sep 30 06:22:30 crc kubenswrapper[4691]: E0930 06:22:30.378115 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016\": container with ID starting with 60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016 not found: ID does not exist" containerID="60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.378152 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016"} err="failed to get container status \"60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016\": rpc error: code = NotFound desc = could not find container \"60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016\": container with ID starting with 60d4dbfedc85fe8b8462f07cad3091485062f4f25bea06139ceb9af28003b016 not found: ID does not exist" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.464243 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.464275 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bkvc\" (UniqueName: \"kubernetes.io/projected/2fae00bc-923f-4e3c-979b-bfde482dc0b0-kube-api-access-7bkvc\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.464288 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2fae00bc-923f-4e3c-979b-bfde482dc0b0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.637918 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6xvc"] Sep 30 06:22:30 crc kubenswrapper[4691]: I0930 06:22:30.640195 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6xvc"] Sep 30 06:22:31 crc kubenswrapper[4691]: I0930 
06:22:31.232671 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" path="/var/lib/kubelet/pods/07c0a985-95a5-4fd3-a271-9e46d5f51af4/volumes" Sep 30 06:22:31 crc kubenswrapper[4691]: I0930 06:22:31.233414 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" path="/var/lib/kubelet/pods/2fae00bc-923f-4e3c-979b-bfde482dc0b0/volumes" Sep 30 06:22:31 crc kubenswrapper[4691]: I0930 06:22:31.234088 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5278b1f-f696-4223-a657-24a3d309b5f3" path="/var/lib/kubelet/pods/c5278b1f-f696-4223-a657-24a3d309b5f3/volumes" Sep 30 06:22:32 crc kubenswrapper[4691]: I0930 06:22:32.509521 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b7txr"] Sep 30 06:22:32 crc kubenswrapper[4691]: I0930 06:22:32.509753 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-b7txr" podUID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerName="registry-server" containerID="cri-o://d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c" gracePeriod=2 Sep 30 06:22:32 crc kubenswrapper[4691]: I0930 06:22:32.880624 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:22:32 crc kubenswrapper[4691]: I0930 06:22:32.993360 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-utilities\") pod \"95812fe8-f69f-4c5a-9a91-e3e054317b63\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " Sep 30 06:22:32 crc kubenswrapper[4691]: I0930 06:22:32.993415 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-slmhx\" (UniqueName: \"kubernetes.io/projected/95812fe8-f69f-4c5a-9a91-e3e054317b63-kube-api-access-slmhx\") pod \"95812fe8-f69f-4c5a-9a91-e3e054317b63\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " Sep 30 06:22:32 crc kubenswrapper[4691]: I0930 06:22:32.993497 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-catalog-content\") pod \"95812fe8-f69f-4c5a-9a91-e3e054317b63\" (UID: \"95812fe8-f69f-4c5a-9a91-e3e054317b63\") " Sep 30 06:22:32 crc kubenswrapper[4691]: I0930 06:22:32.994571 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-utilities" (OuterVolumeSpecName: "utilities") pod "95812fe8-f69f-4c5a-9a91-e3e054317b63" (UID: "95812fe8-f69f-4c5a-9a91-e3e054317b63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:22:32 crc kubenswrapper[4691]: I0930 06:22:32.999180 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95812fe8-f69f-4c5a-9a91-e3e054317b63-kube-api-access-slmhx" (OuterVolumeSpecName: "kube-api-access-slmhx") pod "95812fe8-f69f-4c5a-9a91-e3e054317b63" (UID: "95812fe8-f69f-4c5a-9a91-e3e054317b63"). InnerVolumeSpecName "kube-api-access-slmhx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.092238 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "95812fe8-f69f-4c5a-9a91-e3e054317b63" (UID: "95812fe8-f69f-4c5a-9a91-e3e054317b63"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.094566 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.094608 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-slmhx\" (UniqueName: \"kubernetes.io/projected/95812fe8-f69f-4c5a-9a91-e3e054317b63-kube-api-access-slmhx\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.094623 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/95812fe8-f69f-4c5a-9a91-e3e054317b63-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.333359 4691 generic.go:334] "Generic (PLEG): container finished" podID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerID="d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c" exitCode=0 Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.333431 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7txr" event={"ID":"95812fe8-f69f-4c5a-9a91-e3e054317b63","Type":"ContainerDied","Data":"d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c"} Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.333471 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-b7txr" event={"ID":"95812fe8-f69f-4c5a-9a91-e3e054317b63","Type":"ContainerDied","Data":"dcc3355cfc48edb397cc2a21a05252699f21f58176cef2ea9ad0312e7bf1ec83"} Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.333487 4691 scope.go:117] "RemoveContainer" containerID="d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.333627 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-b7txr" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.345023 4691 scope.go:117] "RemoveContainer" containerID="e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.350144 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-b7txr"] Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.353766 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-b7txr"] Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.361931 4691 scope.go:117] "RemoveContainer" containerID="6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.379935 4691 scope.go:117] "RemoveContainer" containerID="d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c" Sep 30 06:22:33 crc kubenswrapper[4691]: E0930 06:22:33.380514 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c\": container with ID starting with d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c not found: ID does not exist" containerID="d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.380544 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c"} err="failed to get container status \"d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c\": rpc error: code = NotFound desc = could not find container \"d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c\": container with ID starting with d3c17fc449b1570adc3df7bcf235c498f46bf3bff8da847c598fa7523fac163c not found: ID does not exist" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.380568 4691 scope.go:117] "RemoveContainer" containerID="e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623" Sep 30 06:22:33 crc kubenswrapper[4691]: E0930 06:22:33.384260 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623\": container with ID starting with e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623 not found: ID does not exist" containerID="e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.384308 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623"} err="failed to get container status \"e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623\": rpc error: code = NotFound desc = could not find container \"e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623\": container with ID starting with e68ca679ff39c44ff33d0557bdcecfc31138c271b25c9c2f783af24831b01623 not found: ID does not exist" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.384337 4691 scope.go:117] "RemoveContainer" containerID="6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a" Sep 30 06:22:33 crc kubenswrapper[4691]: E0930 06:22:33.384687 4691 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a\": container with ID starting with 6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a not found: ID does not exist" containerID="6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a" Sep 30 06:22:33 crc kubenswrapper[4691]: I0930 06:22:33.384738 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a"} err="failed to get container status \"6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a\": rpc error: code = NotFound desc = could not find container \"6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a\": container with ID starting with 6ee4088e8ad236be90928e426494b7ce66716b19e1fd104637fc3c874eea625a not found: ID does not exist" Sep 30 06:22:35 crc kubenswrapper[4691]: I0930 06:22:35.231361 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95812fe8-f69f-4c5a-9a91-e3e054317b63" path="/var/lib/kubelet/pods/95812fe8-f69f-4c5a-9a91-e3e054317b63/volumes" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.375218 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4bdqt"] Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.375924 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" podUID="7a677441-8b2d-41ae-8dd8-e3334c16c700" containerName="controller-manager" containerID="cri-o://3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558" gracePeriod=30 Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.474046 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz"] Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.474517 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" podUID="535f395c-e127-4a48-8766-707bf9d4d5a3" containerName="route-controller-manager" containerID="cri-o://141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579" gracePeriod=30 Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.737069 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.803225 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.803241 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a677441-8b2d-41ae-8dd8-e3334c16c700-serving-cert\") pod \"7a677441-8b2d-41ae-8dd8-e3334c16c700\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.803285 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-config\") pod \"7a677441-8b2d-41ae-8dd8-e3334c16c700\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.803358 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-client-ca\") pod \"7a677441-8b2d-41ae-8dd8-e3334c16c700\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.803979 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-client-ca" (OuterVolumeSpecName: "client-ca") pod "7a677441-8b2d-41ae-8dd8-e3334c16c700" (UID: "7a677441-8b2d-41ae-8dd8-e3334c16c700"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.804091 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-config" (OuterVolumeSpecName: "config") pod "7a677441-8b2d-41ae-8dd8-e3334c16c700" (UID: "7a677441-8b2d-41ae-8dd8-e3334c16c700"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.804182 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6jc8\" (UniqueName: \"kubernetes.io/projected/7a677441-8b2d-41ae-8dd8-e3334c16c700-kube-api-access-f6jc8\") pod \"7a677441-8b2d-41ae-8dd8-e3334c16c700\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.804210 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-proxy-ca-bundles\") pod \"7a677441-8b2d-41ae-8dd8-e3334c16c700\" (UID: \"7a677441-8b2d-41ae-8dd8-e3334c16c700\") " Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.804528 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7a677441-8b2d-41ae-8dd8-e3334c16c700" (UID: "7a677441-8b2d-41ae-8dd8-e3334c16c700"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.805095 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.805115 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.805128 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7a677441-8b2d-41ae-8dd8-e3334c16c700-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.809426 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a677441-8b2d-41ae-8dd8-e3334c16c700-kube-api-access-f6jc8" (OuterVolumeSpecName: "kube-api-access-f6jc8") pod "7a677441-8b2d-41ae-8dd8-e3334c16c700" (UID: "7a677441-8b2d-41ae-8dd8-e3334c16c700"). InnerVolumeSpecName "kube-api-access-f6jc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.809453 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a677441-8b2d-41ae-8dd8-e3334c16c700-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7a677441-8b2d-41ae-8dd8-e3334c16c700" (UID: "7a677441-8b2d-41ae-8dd8-e3334c16c700"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.906311 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-config\") pod \"535f395c-e127-4a48-8766-707bf9d4d5a3\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.906438 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6vcq\" (UniqueName: \"kubernetes.io/projected/535f395c-e127-4a48-8766-707bf9d4d5a3-kube-api-access-p6vcq\") pod \"535f395c-e127-4a48-8766-707bf9d4d5a3\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.906463 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535f395c-e127-4a48-8766-707bf9d4d5a3-serving-cert\") pod \"535f395c-e127-4a48-8766-707bf9d4d5a3\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.906494 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-client-ca\") pod \"535f395c-e127-4a48-8766-707bf9d4d5a3\" (UID: \"535f395c-e127-4a48-8766-707bf9d4d5a3\") " Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.906664 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a677441-8b2d-41ae-8dd8-e3334c16c700-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.906679 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6jc8\" (UniqueName: 
\"kubernetes.io/projected/7a677441-8b2d-41ae-8dd8-e3334c16c700-kube-api-access-f6jc8\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.907220 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-client-ca" (OuterVolumeSpecName: "client-ca") pod "535f395c-e127-4a48-8766-707bf9d4d5a3" (UID: "535f395c-e127-4a48-8766-707bf9d4d5a3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.907515 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-config" (OuterVolumeSpecName: "config") pod "535f395c-e127-4a48-8766-707bf9d4d5a3" (UID: "535f395c-e127-4a48-8766-707bf9d4d5a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.910949 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/535f395c-e127-4a48-8766-707bf9d4d5a3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "535f395c-e127-4a48-8766-707bf9d4d5a3" (UID: "535f395c-e127-4a48-8766-707bf9d4d5a3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:22:41 crc kubenswrapper[4691]: I0930 06:22:41.921711 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/535f395c-e127-4a48-8766-707bf9d4d5a3-kube-api-access-p6vcq" (OuterVolumeSpecName: "kube-api-access-p6vcq") pod "535f395c-e127-4a48-8766-707bf9d4d5a3" (UID: "535f395c-e127-4a48-8766-707bf9d4d5a3"). InnerVolumeSpecName "kube-api-access-p6vcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.007554 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6vcq\" (UniqueName: \"kubernetes.io/projected/535f395c-e127-4a48-8766-707bf9d4d5a3-kube-api-access-p6vcq\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.007589 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/535f395c-e127-4a48-8766-707bf9d4d5a3-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.007598 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-client-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.007613 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/535f395c-e127-4a48-8766-707bf9d4d5a3-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.405218 4691 generic.go:334] "Generic (PLEG): container finished" podID="7a677441-8b2d-41ae-8dd8-e3334c16c700" containerID="3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558" exitCode=0 Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.406659 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.406647 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" event={"ID":"7a677441-8b2d-41ae-8dd8-e3334c16c700","Type":"ContainerDied","Data":"3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558"} Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.407177 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-4bdqt" event={"ID":"7a677441-8b2d-41ae-8dd8-e3334c16c700","Type":"ContainerDied","Data":"51f276215a44bcc6bac3fd8de14efcc278517f7c02d577bc36d4f2d90d996cf8"} Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.407202 4691 scope.go:117] "RemoveContainer" containerID="3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.408035 4691 generic.go:334] "Generic (PLEG): container finished" podID="535f395c-e127-4a48-8766-707bf9d4d5a3" containerID="141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579" exitCode=0 Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.408066 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" event={"ID":"535f395c-e127-4a48-8766-707bf9d4d5a3","Type":"ContainerDied","Data":"141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579"} Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.408103 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" event={"ID":"535f395c-e127-4a48-8766-707bf9d4d5a3","Type":"ContainerDied","Data":"88b2975d929ad637c7ee9e3385f84baf2c9722d48f88262b68afa90fcc09e9dd"} Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.408356 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.455156 4691 scope.go:117] "RemoveContainer" containerID="3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.456618 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558\": container with ID starting with 3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558 not found: ID does not exist" containerID="3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.456668 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558"} err="failed to get container status \"3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558\": rpc error: code = NotFound desc = could not find container \"3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558\": container with ID starting with 3da83dec82e9fb5197879c972a1ef86ac9b11da0ed3884630618078742fb6558 not found: ID does not exist" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.456700 4691 scope.go:117] "RemoveContainer" containerID="141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.460818 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz"] Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.463366 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-fqxlz"] Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.471841 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4bdqt"] Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.474863 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-4bdqt"] Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.477519 4691 scope.go:117] "RemoveContainer" containerID="141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.477954 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579\": container with ID starting with 141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579 not found: ID does not exist" containerID="141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.478058 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579"} err="failed to get container status \"141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579\": rpc error: code = NotFound desc = could not find container \"141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579\": container with ID starting with 141288b9567948ac2e145dbc13589900de4d46c60614c8ff9fc8e33de478b579 not found: ID does not exist" Sep 30 
06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772008 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv"] Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772318 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerName="extract-utilities" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772336 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerName="extract-utilities" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772347 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772355 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772379 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772385 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772393 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535f395c-e127-4a48-8766-707bf9d4d5a3" containerName="route-controller-manager" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772399 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="535f395c-e127-4a48-8766-707bf9d4d5a3" containerName="route-controller-manager" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772407 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerName="extract-content" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772413 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerName="extract-content" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772424 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerName="extract-utilities" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772430 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerName="extract-utilities" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772454 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerName="extract-content" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772461 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerName="extract-content" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772467 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772473 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772482 4691 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerName="extract-content" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772488 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerName="extract-content" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772495 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerName="extract-content" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772500 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerName="extract-content" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772510 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a677441-8b2d-41ae-8dd8-e3334c16c700" containerName="controller-manager" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772516 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a677441-8b2d-41ae-8dd8-e3334c16c700" containerName="controller-manager" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772538 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerName="extract-utilities" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772544 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerName="extract-utilities" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772554 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerName="extract-utilities" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772560 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerName="extract-utilities" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772568 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772573 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: E0930 06:22:42.772585 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd902b5e-df8a-47be-9781-8a384f3849fb" containerName="pruner" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772591 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd902b5e-df8a-47be-9781-8a384f3849fb" containerName="pruner" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772702 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="95812fe8-f69f-4c5a-9a91-e3e054317b63" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772714 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="2fae00bc-923f-4e3c-979b-bfde482dc0b0" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772722 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a677441-8b2d-41ae-8dd8-e3334c16c700" containerName="controller-manager" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772730 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="535f395c-e127-4a48-8766-707bf9d4d5a3" containerName="route-controller-manager" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772739 4691 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c5278b1f-f696-4223-a657-24a3d309b5f3" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772748 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="07c0a985-95a5-4fd3-a271-9e46d5f51af4" containerName="registry-server" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.772773 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd902b5e-df8a-47be-9781-8a384f3849fb" containerName="pruner" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.773319 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.774158 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7897b44485-4fj4f"] Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.774747 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.776834 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.777070 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.777186 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.777375 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.777427 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.777674 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.777772 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.778013 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.778598 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.778806 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.778806 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.779054 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.796252 4691 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.798511 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv"] Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.817473 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlccc\" (UniqueName: \"kubernetes.io/projected/861e92c9-a61e-43df-9331-c2aee5175fc3-kube-api-access-tlccc\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.817554 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/861e92c9-a61e-43df-9331-c2aee5175fc3-serving-cert\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.817578 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-config\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.817602 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-config\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.817655 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l7g6\" (UniqueName: \"kubernetes.io/projected/71196a78-693d-4f4e-9397-a0c60da648c2-kube-api-access-4l7g6\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.817680 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-client-ca\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.817747 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-proxy-ca-bundles\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.817780 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/71196a78-693d-4f4e-9397-a0c60da648c2-serving-cert\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.817803 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-client-ca\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.819181 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7897b44485-4fj4f"] Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.919275 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-proxy-ca-bundles\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.919316 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71196a78-693d-4f4e-9397-a0c60da648c2-serving-cert\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.919344 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-client-ca\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.919364 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlccc\" (UniqueName: \"kubernetes.io/projected/861e92c9-a61e-43df-9331-c2aee5175fc3-kube-api-access-tlccc\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.919386 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/861e92c9-a61e-43df-9331-c2aee5175fc3-serving-cert\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.919406 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-config\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.919426 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-config\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.919444 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l7g6\" (UniqueName: \"kubernetes.io/projected/71196a78-693d-4f4e-9397-a0c60da648c2-kube-api-access-4l7g6\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.919463 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-client-ca\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.920719 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-client-ca\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.920879 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-config\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.920908 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-proxy-ca-bundles\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.921097 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-config\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.921356 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-client-ca\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.923815 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/861e92c9-a61e-43df-9331-c2aee5175fc3-serving-cert\") pod \"controller-manager-7897b44485-4fj4f\" (UID: 
\"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.924194 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71196a78-693d-4f4e-9397-a0c60da648c2-serving-cert\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.940238 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l7g6\" (UniqueName: \"kubernetes.io/projected/71196a78-693d-4f4e-9397-a0c60da648c2-kube-api-access-4l7g6\") pod \"route-controller-manager-865444f99f-6qhsv\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") " pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:42 crc kubenswrapper[4691]: I0930 06:22:42.946576 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlccc\" (UniqueName: \"kubernetes.io/projected/861e92c9-a61e-43df-9331-c2aee5175fc3-kube-api-access-tlccc\") pod \"controller-manager-7897b44485-4fj4f\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") " pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:43 crc kubenswrapper[4691]: I0930 06:22:43.090968 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:43 crc kubenswrapper[4691]: I0930 06:22:43.104112 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:43 crc kubenswrapper[4691]: I0930 06:22:43.231626 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="535f395c-e127-4a48-8766-707bf9d4d5a3" path="/var/lib/kubelet/pods/535f395c-e127-4a48-8766-707bf9d4d5a3/volumes" Sep 30 06:22:43 crc kubenswrapper[4691]: I0930 06:22:43.232277 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a677441-8b2d-41ae-8dd8-e3334c16c700" path="/var/lib/kubelet/pods/7a677441-8b2d-41ae-8dd8-e3334c16c700/volumes" Sep 30 06:22:43 crc kubenswrapper[4691]: I0930 06:22:43.297372 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv"] Sep 30 06:22:43 crc kubenswrapper[4691]: I0930 06:22:43.355746 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7897b44485-4fj4f"] Sep 30 06:22:43 crc kubenswrapper[4691]: W0930 06:22:43.368384 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod861e92c9_a61e_43df_9331_c2aee5175fc3.slice/crio-5a947043bbf3ff0e8ab25e98067921f75a47b224cf8411772ec7ab1ae951f7e5 WatchSource:0}: Error finding container 5a947043bbf3ff0e8ab25e98067921f75a47b224cf8411772ec7ab1ae951f7e5: Status 404 returned error can't find the container with id 5a947043bbf3ff0e8ab25e98067921f75a47b224cf8411772ec7ab1ae951f7e5 Sep 30 06:22:43 crc kubenswrapper[4691]: I0930 06:22:43.416227 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" 
event={"ID":"861e92c9-a61e-43df-9331-c2aee5175fc3","Type":"ContainerStarted","Data":"5a947043bbf3ff0e8ab25e98067921f75a47b224cf8411772ec7ab1ae951f7e5"} Sep 30 06:22:43 crc kubenswrapper[4691]: I0930 06:22:43.418792 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" event={"ID":"71196a78-693d-4f4e-9397-a0c60da648c2","Type":"ContainerStarted","Data":"75e432cb5b18346f0e75da1883f7d28fa520cb6606e63d4414ff63d4deb4ebcf"} Sep 30 06:22:44 crc kubenswrapper[4691]: I0930 06:22:44.426564 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" event={"ID":"861e92c9-a61e-43df-9331-c2aee5175fc3","Type":"ContainerStarted","Data":"4a54b6ec3b4548717be6f5961483781523e0099c361ce3f49e1b6f295e1e345d"} Sep 30 06:22:44 crc kubenswrapper[4691]: I0930 06:22:44.427254 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:44 crc kubenswrapper[4691]: I0930 06:22:44.428205 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" event={"ID":"71196a78-693d-4f4e-9397-a0c60da648c2","Type":"ContainerStarted","Data":"ea460a4cf5a57c8a9454974ef379b7ec4152b0d799fa98c30221c721b82dac68"} Sep 30 06:22:44 crc kubenswrapper[4691]: I0930 06:22:44.428873 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:44 crc kubenswrapper[4691]: I0930 06:22:44.432841 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" Sep 30 06:22:44 crc kubenswrapper[4691]: I0930 06:22:44.435831 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" Sep 30 06:22:44 crc kubenswrapper[4691]: I0930 06:22:44.448548 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" podStartSLOduration=3.448526304 podStartE2EDuration="3.448526304s" podCreationTimestamp="2025-09-30 06:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:22:44.445917441 +0000 UTC m=+207.920938541" watchObservedRunningTime="2025-09-30 06:22:44.448526304 +0000 UTC m=+207.923547374" Sep 30 06:22:44 crc kubenswrapper[4691]: I0930 06:22:44.469950 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" podStartSLOduration=3.469931373 podStartE2EDuration="3.469931373s" podCreationTimestamp="2025-09-30 06:22:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:22:44.463832259 +0000 UTC m=+207.938853339" watchObservedRunningTime="2025-09-30 06:22:44.469931373 +0000 UTC m=+207.944952413" Sep 30 06:22:52 crc kubenswrapper[4691]: I0930 06:22:52.850425 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Sep 30 06:22:52 crc kubenswrapper[4691]: I0930 06:22:52.851127 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:22:52 crc kubenswrapper[4691]: I0930 06:22:52.851190 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:22:52 crc kubenswrapper[4691]: I0930 06:22:52.851967 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:22:52 crc kubenswrapper[4691]: I0930 06:22:52.852070 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c" gracePeriod=600 Sep 30 06:22:53 crc kubenswrapper[4691]: I0930 06:22:53.499176 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c" exitCode=0 Sep 30 06:22:53 crc kubenswrapper[4691]: I0930 06:22:53.499298 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c"} Sep 30 06:22:53 crc kubenswrapper[4691]: I0930 06:22:53.499586 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"5124cec3e8ade06d39c26cde1baaa625eb5e8cb0cb2eb147c1c6f02b93ecaae0"} Sep 30 06:22:54 crc kubenswrapper[4691]: I0930 06:22:54.447996 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" podUID="46e3679a-b63e-4f7c-b118-02287f570a24" containerName="oauth-openshift" containerID="cri-o://883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09" gracePeriod=15 Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.040370 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.101198 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-error\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.101626 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-idp-0-file-data\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.101669 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-cliconfig\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.101709 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssd8b\" (UniqueName: \"kubernetes.io/projected/46e3679a-b63e-4f7c-b118-02287f570a24-kube-api-access-ssd8b\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.101752 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-session\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.101813 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-login\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.101848 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-ocp-branding-template\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.101938 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-service-ca\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.101984 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-audit-policies\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") " Sep 30 06:22:55 crc kubenswrapper[4691]: 
I0930 06:22:55.102019 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-router-certs\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") "
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.102040 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-trusted-ca-bundle\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") "
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.102061 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-provider-selection\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") "
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.102092 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-serving-cert\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") "
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.102121 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46e3679a-b63e-4f7c-b118-02287f570a24-audit-dir\") pod \"46e3679a-b63e-4f7c-b118-02287f570a24\" (UID: \"46e3679a-b63e-4f7c-b118-02287f570a24\") "
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.102366 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/46e3679a-b63e-4f7c-b118-02287f570a24-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.103276 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.103454 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.103764 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.104680 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.110688 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.111219 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.112028 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.112488 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.113426 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46e3679a-b63e-4f7c-b118-02287f570a24-kube-api-access-ssd8b" (OuterVolumeSpecName: "kube-api-access-ssd8b") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "kube-api-access-ssd8b". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.113469 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.113855 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.114120 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.115022 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "46e3679a-b63e-4f7c-b118-02287f570a24" (UID: "46e3679a-b63e-4f7c-b118-02287f570a24"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.203822 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.203948 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.203972 4691 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/46e3679a-b63e-4f7c-b118-02287f570a24-audit-dir\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.203991 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.204011 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.204052 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.204073 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssd8b\" (UniqueName: \"kubernetes.io/projected/46e3679a-b63e-4f7c-b118-02287f570a24-kube-api-access-ssd8b\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.204096 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-session\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.204265 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.204285 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.204307 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.204327 4691 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-audit-policies\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.204346 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.204368 4691 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/46e3679a-b63e-4f7c-b118-02287f570a24-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.517350 4691 generic.go:334] "Generic (PLEG): container finished" podID="46e3679a-b63e-4f7c-b118-02287f570a24" containerID="883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09" exitCode=0
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.517430 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" event={"ID":"46e3679a-b63e-4f7c-b118-02287f570a24","Type":"ContainerDied","Data":"883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09"}
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.517458 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj"
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.517493 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-r8bfj" event={"ID":"46e3679a-b63e-4f7c-b118-02287f570a24","Type":"ContainerDied","Data":"570f75deff6c2681edaecad239d0e9e4e6133109760d5d09ea77a65fcf1d59b9"}
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.517530 4691 scope.go:117] "RemoveContainer" containerID="883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09"
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.548045 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8bfj"]
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.553378 4691 scope.go:117] "RemoveContainer" containerID="883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09"
Sep 30 06:22:55 crc kubenswrapper[4691]: E0930 06:22:55.554652 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09\": container with ID starting with 883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09 not found: ID does not exist" containerID="883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09"
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.554745 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09"} err="failed to get container status \"883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09\": rpc error: code = NotFound desc = could not find container \"883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09\": container with ID starting with 883c597d43032755b085bf364c08ef47a535703632e5cfaa15e8634989643a09 not found: ID does not exist"
Sep 30 06:22:55 crc kubenswrapper[4691]: I0930 06:22:55.564140 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-r8bfj"]
Sep 30 06:22:57 crc kubenswrapper[4691]: I0930 06:22:57.237630 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46e3679a-b63e-4f7c-b118-02287f570a24" path="/var/lib/kubelet/pods/46e3679a-b63e-4f7c-b118-02287f570a24/volumes"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.791537 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"]
Sep 30 06:22:59 crc kubenswrapper[4691]: E0930 06:22:59.792187 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46e3679a-b63e-4f7c-b118-02287f570a24" containerName="oauth-openshift"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.792208 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="46e3679a-b63e-4f7c-b118-02287f570a24" containerName="oauth-openshift"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.792369 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="46e3679a-b63e-4f7c-b118-02287f570a24" containerName="oauth-openshift"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.792970 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.799079 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.799757 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.801313 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.801332 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.801328 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.801544 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.802068 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.802511 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.802767 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.803321 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.803376 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.803470 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.814073 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.822063 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"]
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.823038 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.866082 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.874565 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.874666 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.874724 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.874768 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.874834 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rmtt\" (UniqueName: \"kubernetes.io/projected/bae742eb-4220-4fa8-9af6-1934e3808345-kube-api-access-4rmtt\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.874872 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.874938 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-service-ca\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.874985 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-template-error\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.875036 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-router-certs\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.875092 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bae742eb-4220-4fa8-9af6-1934e3808345-audit-dir\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.875129 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-template-login\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.875169 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-audit-policies\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.875203 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-session\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.875237 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.976865 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rmtt\" (UniqueName: \"kubernetes.io/projected/bae742eb-4220-4fa8-9af6-1934e3808345-kube-api-access-4rmtt\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.977370 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.977642 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-service-ca\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.977915 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-template-error\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.978168 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-router-certs\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.978362 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bae742eb-4220-4fa8-9af6-1934e3808345-audit-dir\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.978540 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-template-login\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.978750 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-audit-policies\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.979002 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-session\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.979219 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.979436 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.979658 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.979875 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.980689 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.979117 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-service-ca\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.980289 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-audit-policies\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.978454 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bae742eb-4220-4fa8-9af6-1934e3808345-audit-dir\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.982476 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.982823 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.985284 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-template-error\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.985691 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-router-certs\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.986514 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-session\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.987349 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.988483 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.989149 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.990207 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:22:59 crc kubenswrapper[4691]: I0930 06:22:59.990451 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/bae742eb-4220-4fa8-9af6-1934e3808345-v4-0-config-user-template-login\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:23:00 crc kubenswrapper[4691]: I0930 06:23:00.004413 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rmtt\" (UniqueName: \"kubernetes.io/projected/bae742eb-4220-4fa8-9af6-1934e3808345-kube-api-access-4rmtt\") pod \"oauth-openshift-6869cbc5df-p5kwd\" (UID: \"bae742eb-4220-4fa8-9af6-1934e3808345\") " pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:23:00 crc kubenswrapper[4691]: I0930 06:23:00.155470 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:23:00 crc kubenswrapper[4691]: I0930 06:23:00.646556 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"]
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.413472 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7897b44485-4fj4f"]
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.414107 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" podUID="861e92c9-a61e-43df-9331-c2aee5175fc3" containerName="controller-manager" containerID="cri-o://4a54b6ec3b4548717be6f5961483781523e0099c361ce3f49e1b6f295e1e345d" gracePeriod=30
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.438993 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv"]
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.439629 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" podUID="71196a78-693d-4f4e-9397-a0c60da648c2" containerName="route-controller-manager" containerID="cri-o://ea460a4cf5a57c8a9454974ef379b7ec4152b0d799fa98c30221c721b82dac68" gracePeriod=30
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.564619 4691 generic.go:334] "Generic (PLEG): container finished" podID="71196a78-693d-4f4e-9397-a0c60da648c2" containerID="ea460a4cf5a57c8a9454974ef379b7ec4152b0d799fa98c30221c721b82dac68" exitCode=0
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.564713 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" event={"ID":"71196a78-693d-4f4e-9397-a0c60da648c2","Type":"ContainerDied","Data":"ea460a4cf5a57c8a9454974ef379b7ec4152b0d799fa98c30221c721b82dac68"}
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.566860 4691 generic.go:334] "Generic (PLEG): container finished" podID="861e92c9-a61e-43df-9331-c2aee5175fc3" containerID="4a54b6ec3b4548717be6f5961483781523e0099c361ce3f49e1b6f295e1e345d" exitCode=0
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.566948 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" event={"ID":"861e92c9-a61e-43df-9331-c2aee5175fc3","Type":"ContainerDied","Data":"4a54b6ec3b4548717be6f5961483781523e0099c361ce3f49e1b6f295e1e345d"}
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.568920 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd" event={"ID":"bae742eb-4220-4fa8-9af6-1934e3808345","Type":"ContainerStarted","Data":"b74a025070342040731aa7723ebe39977ba3365298bf8260985f596cf7970c2a"}
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.568986 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd" event={"ID":"bae742eb-4220-4fa8-9af6-1934e3808345","Type":"ContainerStarted","Data":"eb289cd4220014a4ff8a74c4f9175f8506c97a1fb1036385568d990fa3010f87"}
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.569240 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.592316 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd" podStartSLOduration=32.592294583 podStartE2EDuration="32.592294583s" podCreationTimestamp="2025-09-30 06:22:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:23:01.589177284 +0000 UTC m=+225.064198364" watchObservedRunningTime="2025-09-30 06:23:01.592294583 +0000 UTC m=+225.067315643"
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.693165 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6869cbc5df-p5kwd"
Sep 30 06:23:01 crc kubenswrapper[4691]: I0930 06:23:01.957467 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.029717 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-config\") pod \"71196a78-693d-4f4e-9397-a0c60da648c2\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") "
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.029754 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-client-ca\") pod \"71196a78-693d-4f4e-9397-a0c60da648c2\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") "
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.029782 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71196a78-693d-4f4e-9397-a0c60da648c2-serving-cert\") pod \"71196a78-693d-4f4e-9397-a0c60da648c2\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") "
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.029857 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l7g6\" (UniqueName: \"kubernetes.io/projected/71196a78-693d-4f4e-9397-a0c60da648c2-kube-api-access-4l7g6\") pod \"71196a78-693d-4f4e-9397-a0c60da648c2\" (UID: \"71196a78-693d-4f4e-9397-a0c60da648c2\") "
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.030627 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-client-ca" (OuterVolumeSpecName: "client-ca") pod "71196a78-693d-4f4e-9397-a0c60da648c2" (UID: "71196a78-693d-4f4e-9397-a0c60da648c2"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.030636 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-config" (OuterVolumeSpecName: "config") pod "71196a78-693d-4f4e-9397-a0c60da648c2" (UID: "71196a78-693d-4f4e-9397-a0c60da648c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.035098 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71196a78-693d-4f4e-9397-a0c60da648c2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "71196a78-693d-4f4e-9397-a0c60da648c2" (UID: "71196a78-693d-4f4e-9397-a0c60da648c2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.038993 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71196a78-693d-4f4e-9397-a0c60da648c2-kube-api-access-4l7g6" (OuterVolumeSpecName: "kube-api-access-4l7g6") pod "71196a78-693d-4f4e-9397-a0c60da648c2" (UID: "71196a78-693d-4f4e-9397-a0c60da648c2"). InnerVolumeSpecName "kube-api-access-4l7g6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.060138 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.130750 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-proxy-ca-bundles\") pod \"861e92c9-a61e-43df-9331-c2aee5175fc3\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") "
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.130861 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-client-ca\") pod \"861e92c9-a61e-43df-9331-c2aee5175fc3\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") "
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.130940 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-config\") pod \"861e92c9-a61e-43df-9331-c2aee5175fc3\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") "
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.130968 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/861e92c9-a61e-43df-9331-c2aee5175fc3-serving-cert\") pod \"861e92c9-a61e-43df-9331-c2aee5175fc3\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") "
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.130995 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlccc\" (UniqueName: \"kubernetes.io/projected/861e92c9-a61e-43df-9331-c2aee5175fc3-kube-api-access-tlccc\") pod \"861e92c9-a61e-43df-9331-c2aee5175fc3\" (UID: \"861e92c9-a61e-43df-9331-c2aee5175fc3\") "
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.131296 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l7g6\" (UniqueName: \"kubernetes.io/projected/71196a78-693d-4f4e-9397-a0c60da648c2-kube-api-access-4l7g6\") on node \"crc\" DevicePath \"\""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.131324 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-config\") on node \"crc\" DevicePath \"\""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.131338 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71196a78-693d-4f4e-9397-a0c60da648c2-client-ca\") on node \"crc\" DevicePath \"\""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.131350 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71196a78-693d-4f4e-9397-a0c60da648c2-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.131534 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-client-ca" (OuterVolumeSpecName: "client-ca") pod "861e92c9-a61e-43df-9331-c2aee5175fc3" (UID: "861e92c9-a61e-43df-9331-c2aee5175fc3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.131661 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-config" (OuterVolumeSpecName: "config") pod "861e92c9-a61e-43df-9331-c2aee5175fc3" (UID: "861e92c9-a61e-43df-9331-c2aee5175fc3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.131680 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "861e92c9-a61e-43df-9331-c2aee5175fc3" (UID: "861e92c9-a61e-43df-9331-c2aee5175fc3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.133972 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/861e92c9-a61e-43df-9331-c2aee5175fc3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "861e92c9-a61e-43df-9331-c2aee5175fc3" (UID: "861e92c9-a61e-43df-9331-c2aee5175fc3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.134818 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/861e92c9-a61e-43df-9331-c2aee5175fc3-kube-api-access-tlccc" (OuterVolumeSpecName: "kube-api-access-tlccc") pod "861e92c9-a61e-43df-9331-c2aee5175fc3" (UID: "861e92c9-a61e-43df-9331-c2aee5175fc3"). InnerVolumeSpecName "kube-api-access-tlccc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.232591 4691 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.232645 4691 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-client-ca\") on node \"crc\" DevicePath \"\""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.232667 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/861e92c9-a61e-43df-9331-c2aee5175fc3-config\") on node \"crc\" DevicePath \"\""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.232685 4691 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/861e92c9-a61e-43df-9331-c2aee5175fc3-serving-cert\") on node \"crc\" DevicePath \"\""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.232702 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlccc\" (UniqueName: \"kubernetes.io/projected/861e92c9-a61e-43df-9331-c2aee5175fc3-kube-api-access-tlccc\") on node \"crc\" DevicePath \"\""
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.577674 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv" event={"ID":"71196a78-693d-4f4e-9397-a0c60da648c2","Type":"ContainerDied","Data":"75e432cb5b18346f0e75da1883f7d28fa520cb6606e63d4414ff63d4deb4ebcf"}
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.577695 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.577768 4691 scope.go:117] "RemoveContainer" containerID="ea460a4cf5a57c8a9454974ef379b7ec4152b0d799fa98c30221c721b82dac68"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.581916 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.581925 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7897b44485-4fj4f" event={"ID":"861e92c9-a61e-43df-9331-c2aee5175fc3","Type":"ContainerDied","Data":"5a947043bbf3ff0e8ab25e98067921f75a47b224cf8411772ec7ab1ae951f7e5"}
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.609961 4691 scope.go:117] "RemoveContainer" containerID="4a54b6ec3b4548717be6f5961483781523e0099c361ce3f49e1b6f295e1e345d"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.629474 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv"]
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.634236 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-865444f99f-6qhsv"]
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.643538 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7897b44485-4fj4f"]
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.649605 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7897b44485-4fj4f"]
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.795593 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-fbf676896-d94w6"]
Sep 30 06:23:02 crc kubenswrapper[4691]: E0930 06:23:02.795863 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71196a78-693d-4f4e-9397-a0c60da648c2" containerName="route-controller-manager"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.795900 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="71196a78-693d-4f4e-9397-a0c60da648c2" containerName="route-controller-manager"
Sep 30 06:23:02 crc kubenswrapper[4691]: E0930 06:23:02.795923 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="861e92c9-a61e-43df-9331-c2aee5175fc3" containerName="controller-manager"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.795934 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="861e92c9-a61e-43df-9331-c2aee5175fc3" containerName="controller-manager"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.796050 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="71196a78-693d-4f4e-9397-a0c60da648c2" containerName="route-controller-manager"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.796065 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="861e92c9-a61e-43df-9331-c2aee5175fc3" containerName="controller-manager"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.796536 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.798616 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.799690 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.801172 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.801417 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.801423 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.801765 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.803778 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"]
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.804637 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.808389 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.808621 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.810418 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.810427 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.810451 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.810828 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.810971 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.812771 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fbf676896-d94w6"]
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.821841 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"]
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.944917 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4401eb03-3572-421e-93ff-cb75efb1eed1-config\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.944979 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4401eb03-3572-421e-93ff-cb75efb1eed1-serving-cert\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.945141 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-serving-cert\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.945246 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4401eb03-3572-421e-93ff-cb75efb1eed1-client-ca\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.945284 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-client-ca\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.945322 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2fhs\" (UniqueName: \"kubernetes.io/projected/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-kube-api-access-l2fhs\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.945344 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrnsl\" (UniqueName: \"kubernetes.io/projected/4401eb03-3572-421e-93ff-cb75efb1eed1-kube-api-access-vrnsl\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.945364 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4401eb03-3572-421e-93ff-cb75efb1eed1-proxy-ca-bundles\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:02 crc kubenswrapper[4691]: I0930 06:23:02.945518 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-config\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.046764 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-config\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.046865 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4401eb03-3572-421e-93ff-cb75efb1eed1-config\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.046962 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4401eb03-3572-421e-93ff-cb75efb1eed1-serving-cert\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.047002 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-serving-cert\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.047084 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4401eb03-3572-421e-93ff-cb75efb1eed1-client-ca\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.047148 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-client-ca\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.047195 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2fhs\" (UniqueName: \"kubernetes.io/projected/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-kube-api-access-l2fhs\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.047231 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrnsl\" (UniqueName: \"kubernetes.io/projected/4401eb03-3572-421e-93ff-cb75efb1eed1-kube-api-access-vrnsl\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.047264 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4401eb03-3572-421e-93ff-cb75efb1eed1-proxy-ca-bundles\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.048386 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4401eb03-3572-421e-93ff-cb75efb1eed1-config\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.049480 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4401eb03-3572-421e-93ff-cb75efb1eed1-client-ca\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.049596 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-config\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.049688 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4401eb03-3572-421e-93ff-cb75efb1eed1-proxy-ca-bundles\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.050221 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-client-ca\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.054358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4401eb03-3572-421e-93ff-cb75efb1eed1-serving-cert\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.054370 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-serving-cert\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"
Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.078207 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-l2fhs\" (UniqueName: \"kubernetes.io/projected/a6c7567a-d200-4ff2-8a5c-b4b5746f7101-kube-api-access-l2fhs\") pod \"route-controller-manager-6444bd78fb-f5ccr\" (UID: \"a6c7567a-d200-4ff2-8a5c-b4b5746f7101\") " pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr" Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.080872 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrnsl\" (UniqueName: \"kubernetes.io/projected/4401eb03-3572-421e-93ff-cb75efb1eed1-kube-api-access-vrnsl\") pod \"controller-manager-fbf676896-d94w6\" (UID: \"4401eb03-3572-421e-93ff-cb75efb1eed1\") " pod="openshift-controller-manager/controller-manager-fbf676896-d94w6" Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.144791 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-fbf676896-d94w6" Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.154311 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr" Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.231431 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71196a78-693d-4f4e-9397-a0c60da648c2" path="/var/lib/kubelet/pods/71196a78-693d-4f4e-9397-a0c60da648c2/volumes" Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.232181 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="861e92c9-a61e-43df-9331-c2aee5175fc3" path="/var/lib/kubelet/pods/861e92c9-a61e-43df-9331-c2aee5175fc3/volumes" Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.635408 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-fbf676896-d94w6"] Sep 30 06:23:03 crc kubenswrapper[4691]: I0930 06:23:03.640372 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr"] Sep 30 06:23:03 crc kubenswrapper[4691]: W0930 06:23:03.640722 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4401eb03_3572_421e_93ff_cb75efb1eed1.slice/crio-f992254318cd4a17bb7f194eca6a5a6ff5b03a3dc06adfb98bd2546e4f4258be WatchSource:0}: Error finding container f992254318cd4a17bb7f194eca6a5a6ff5b03a3dc06adfb98bd2546e4f4258be: Status 404 returned error can't find the container with id f992254318cd4a17bb7f194eca6a5a6ff5b03a3dc06adfb98bd2546e4f4258be Sep 30 06:23:03 crc kubenswrapper[4691]: W0930 06:23:03.642944 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6c7567a_d200_4ff2_8a5c_b4b5746f7101.slice/crio-53e6d7a748c5a960b3805035a4eb33a4caa1cf2babd6a7b05f0998bb1b943581 WatchSource:0}: Error finding container 53e6d7a748c5a960b3805035a4eb33a4caa1cf2babd6a7b05f0998bb1b943581: Status 404 returned error can't find the container with id 53e6d7a748c5a960b3805035a4eb33a4caa1cf2babd6a7b05f0998bb1b943581 Sep 30 06:23:04 crc kubenswrapper[4691]: I0930 06:23:04.597133 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fbf676896-d94w6" event={"ID":"4401eb03-3572-421e-93ff-cb75efb1eed1","Type":"ContainerStarted","Data":"67b06f05910cf641d34c79faa04e381a50a89d2db61a1a5dd420a3c643e52b04"} Sep 30 06:23:04 crc kubenswrapper[4691]: I0930 06:23:04.597363 4691 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-fbf676896-d94w6" event={"ID":"4401eb03-3572-421e-93ff-cb75efb1eed1","Type":"ContainerStarted","Data":"f992254318cd4a17bb7f194eca6a5a6ff5b03a3dc06adfb98bd2546e4f4258be"} Sep 30 06:23:04 crc kubenswrapper[4691]: I0930 06:23:04.599295 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-fbf676896-d94w6" Sep 30 06:23:04 crc kubenswrapper[4691]: I0930 06:23:04.601647 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr" event={"ID":"a6c7567a-d200-4ff2-8a5c-b4b5746f7101","Type":"ContainerStarted","Data":"e0847386f8404cfce04d40969eddd877c8db14421fbacb041df3d32a1f51d1a0"} Sep 30 06:23:04 crc kubenswrapper[4691]: I0930 06:23:04.601689 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr" event={"ID":"a6c7567a-d200-4ff2-8a5c-b4b5746f7101","Type":"ContainerStarted","Data":"53e6d7a748c5a960b3805035a4eb33a4caa1cf2babd6a7b05f0998bb1b943581"} Sep 30 06:23:04 crc kubenswrapper[4691]: I0930 06:23:04.601923 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr" Sep 30 06:23:04 crc kubenswrapper[4691]: I0930 06:23:04.602125 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-fbf676896-d94w6" Sep 30 06:23:04 crc kubenswrapper[4691]: I0930 06:23:04.607766 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr" Sep 30 06:23:04 crc kubenswrapper[4691]: I0930 06:23:04.620276 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-fbf676896-d94w6" podStartSLOduration=3.6202575919999997 podStartE2EDuration="3.620257592s" podCreationTimestamp="2025-09-30 06:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:23:04.618155456 +0000 UTC m=+228.093176506" watchObservedRunningTime="2025-09-30 06:23:04.620257592 +0000 UTC m=+228.095278642" Sep 30 06:23:04 crc kubenswrapper[4691]: I0930 06:23:04.654551 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6444bd78fb-f5ccr" podStartSLOduration=3.654531781 podStartE2EDuration="3.654531781s" podCreationTimestamp="2025-09-30 06:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:23:04.651704361 +0000 UTC m=+228.126725421" watchObservedRunningTime="2025-09-30 06:23:04.654531781 +0000 UTC m=+228.129552831" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.260714 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzgfj"] Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.261493 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-vzgfj" podUID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerName="registry-server" 
containerID="cri-o://425e7dd7c338e2cf530893d676a01438c6b86ddd8fbb35ebc43df3f02c5e88da" gracePeriod=30 Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.279148 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qln5x"] Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.279486 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-qln5x" podUID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerName="registry-server" containerID="cri-o://bc8f5b68997e3a0b344d66030cdc6f2de9602f301ebb44c0dad2b4a164c943c6" gracePeriod=30 Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.294088 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dzf49"] Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.294643 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" podUID="f5ef6b93-5bb5-467f-8268-5feb300e2d5c" containerName="marketplace-operator" containerID="cri-o://9dfc77521acadc64b053263fa9d2d2f760270714a851575b1333fd531f276b65" gracePeriod=30 Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.306307 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mzhd"] Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.306685 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9mzhd" podUID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerName="registry-server" containerID="cri-o://e8c6fa72db0c9e7d81b84402cc9c28a73de65b554ae0c1a357a6a5d81dadae3f" gracePeriod=30 Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.316749 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4km9n"] Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.317568 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.321002 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ghx4r"] Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.321306 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ghx4r" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerName="registry-server" containerID="cri-o://be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0" gracePeriod=30 Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.324661 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4km9n"] Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.460624 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kwss\" (UniqueName: \"kubernetes.io/projected/f46c875b-2f18-4fac-98af-64b0756b7e26-kube-api-access-8kwss\") pod \"marketplace-operator-79b997595-4km9n\" (UID: \"f46c875b-2f18-4fac-98af-64b0756b7e26\") " pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.461082 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f46c875b-2f18-4fac-98af-64b0756b7e26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4km9n\" (UID: \"f46c875b-2f18-4fac-98af-64b0756b7e26\") " pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.461122 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f46c875b-2f18-4fac-98af-64b0756b7e26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4km9n\" (UID: \"f46c875b-2f18-4fac-98af-64b0756b7e26\") " pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.562421 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f46c875b-2f18-4fac-98af-64b0756b7e26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4km9n\" (UID: \"f46c875b-2f18-4fac-98af-64b0756b7e26\") " pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.562499 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kwss\" (UniqueName: \"kubernetes.io/projected/f46c875b-2f18-4fac-98af-64b0756b7e26-kube-api-access-8kwss\") pod \"marketplace-operator-79b997595-4km9n\" (UID: \"f46c875b-2f18-4fac-98af-64b0756b7e26\") " pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.562545 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f46c875b-2f18-4fac-98af-64b0756b7e26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4km9n\" (UID: \"f46c875b-2f18-4fac-98af-64b0756b7e26\") " pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.564335 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f46c875b-2f18-4fac-98af-64b0756b7e26-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-4km9n\" (UID: \"f46c875b-2f18-4fac-98af-64b0756b7e26\") " pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.571666 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f46c875b-2f18-4fac-98af-64b0756b7e26-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-4km9n\" (UID: \"f46c875b-2f18-4fac-98af-64b0756b7e26\") " pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.583387 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kwss\" (UniqueName: \"kubernetes.io/projected/f46c875b-2f18-4fac-98af-64b0756b7e26-kube-api-access-8kwss\") pod \"marketplace-operator-79b997595-4km9n\" (UID: \"f46c875b-2f18-4fac-98af-64b0756b7e26\") " pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.694408 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.711823 4691 generic.go:334] "Generic (PLEG): container finished" podID="f5ef6b93-5bb5-467f-8268-5feb300e2d5c" containerID="9dfc77521acadc64b053263fa9d2d2f760270714a851575b1333fd531f276b65" exitCode=0 Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.711966 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" event={"ID":"f5ef6b93-5bb5-467f-8268-5feb300e2d5c","Type":"ContainerDied","Data":"9dfc77521acadc64b053263fa9d2d2f760270714a851575b1333fd531f276b65"} Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.714396 4691 generic.go:334] "Generic (PLEG): container finished" podID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerID="be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0" exitCode=0 Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.714469 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghx4r" event={"ID":"c4fda877-ac4f-419b-9cf9-933c5bca0aba","Type":"ContainerDied","Data":"be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0"} Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.721462 4691 generic.go:334] "Generic (PLEG): container finished" podID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerID="425e7dd7c338e2cf530893d676a01438c6b86ddd8fbb35ebc43df3f02c5e88da" exitCode=0 Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.721530 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzgfj" event={"ID":"b2bb2e76-e094-4320-8bab-f54bf623dcb1","Type":"ContainerDied","Data":"425e7dd7c338e2cf530893d676a01438c6b86ddd8fbb35ebc43df3f02c5e88da"} Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.724213 4691 generic.go:334] "Generic (PLEG): container finished" podID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerID="e8c6fa72db0c9e7d81b84402cc9c28a73de65b554ae0c1a357a6a5d81dadae3f" exitCode=0 Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.724253 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-9mzhd" event={"ID":"fecc9d18-f13e-4620-a9da-b620e9660ec7","Type":"ContainerDied","Data":"e8c6fa72db0c9e7d81b84402cc9c28a73de65b554ae0c1a357a6a5d81dadae3f"} Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.726155 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerID="bc8f5b68997e3a0b344d66030cdc6f2de9602f301ebb44c0dad2b4a164c943c6" exitCode=0 Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.726176 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qln5x" event={"ID":"ca900afa-86c0-4fe0-ba3d-d5d927db24b7","Type":"ContainerDied","Data":"bc8f5b68997e3a0b344d66030cdc6f2de9602f301ebb44c0dad2b4a164c943c6"} Sep 30 06:23:18 crc kubenswrapper[4691]: E0930 06:23:18.776590 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0 is running failed: container process not found" containerID="be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 06:23:18 crc kubenswrapper[4691]: E0930 06:23:18.777091 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0 is running failed: container process not found" containerID="be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 06:23:18 crc kubenswrapper[4691]: E0930 06:23:18.777693 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0 is running failed: container process not found" containerID="be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0" cmd=["grpc_health_probe","-addr=:50051"] Sep 30 06:23:18 crc kubenswrapper[4691]: E0930 06:23:18.777731 4691 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-ghx4r" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerName="registry-server" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.869847 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.958649 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.966715 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-utilities\") pod \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.966804 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7j542\" (UniqueName: \"kubernetes.io/projected/b2bb2e76-e094-4320-8bab-f54bf623dcb1-kube-api-access-7j542\") pod \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.967116 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-catalog-content\") pod \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\" (UID: \"b2bb2e76-e094-4320-8bab-f54bf623dcb1\") " Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.968378 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-utilities" (OuterVolumeSpecName: "utilities") pod "b2bb2e76-e094-4320-8bab-f54bf623dcb1" (UID: "b2bb2e76-e094-4320-8bab-f54bf623dcb1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.976481 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2bb2e76-e094-4320-8bab-f54bf623dcb1-kube-api-access-7j542" (OuterVolumeSpecName: "kube-api-access-7j542") pod "b2bb2e76-e094-4320-8bab-f54bf623dcb1" (UID: "b2bb2e76-e094-4320-8bab-f54bf623dcb1"). InnerVolumeSpecName "kube-api-access-7j542". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.977088 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.987093 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:23:18 crc kubenswrapper[4691]: I0930 06:23:18.996720 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.029561 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2bb2e76-e094-4320-8bab-f54bf623dcb1" (UID: "b2bb2e76-e094-4320-8bab-f54bf623dcb1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068484 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-catalog-content\") pod \"fecc9d18-f13e-4620-a9da-b620e9660ec7\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068537 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7wxt\" (UniqueName: \"kubernetes.io/projected/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-kube-api-access-x7wxt\") pod \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068574 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-trusted-ca\") pod \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068604 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-utilities\") pod \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068620 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w2nrv\" (UniqueName: \"kubernetes.io/projected/fecc9d18-f13e-4620-a9da-b620e9660ec7-kube-api-access-w2nrv\") pod \"fecc9d18-f13e-4620-a9da-b620e9660ec7\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068643 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-znsl5\" (UniqueName: \"kubernetes.io/projected/c4fda877-ac4f-419b-9cf9-933c5bca0aba-kube-api-access-znsl5\") pod \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068666 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-catalog-content\") pod \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068687 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-operator-metrics\") pod \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068723 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-676xh\" (UniqueName: \"kubernetes.io/projected/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-kube-api-access-676xh\") pod \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\" (UID: \"f5ef6b93-5bb5-467f-8268-5feb300e2d5c\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068751 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-catalog-content\") pod \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\" (UID: \"ca900afa-86c0-4fe0-ba3d-d5d927db24b7\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068778 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-utilities\") pod \"fecc9d18-f13e-4620-a9da-b620e9660ec7\" (UID: \"fecc9d18-f13e-4620-a9da-b620e9660ec7\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.068808 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-utilities\") pod \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\" (UID: \"c4fda877-ac4f-419b-9cf9-933c5bca0aba\") " Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.069011 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.069028 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7j542\" (UniqueName: \"kubernetes.io/projected/b2bb2e76-e094-4320-8bab-f54bf623dcb1-kube-api-access-7j542\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.069045 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2bb2e76-e094-4320-8bab-f54bf623dcb1-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.070076 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-utilities" (OuterVolumeSpecName: "utilities") pod "fecc9d18-f13e-4620-a9da-b620e9660ec7" (UID: "fecc9d18-f13e-4620-a9da-b620e9660ec7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.070932 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f5ef6b93-5bb5-467f-8268-5feb300e2d5c" (UID: "f5ef6b93-5bb5-467f-8268-5feb300e2d5c"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.072584 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4fda877-ac4f-419b-9cf9-933c5bca0aba-kube-api-access-znsl5" (OuterVolumeSpecName: "kube-api-access-znsl5") pod "c4fda877-ac4f-419b-9cf9-933c5bca0aba" (UID: "c4fda877-ac4f-419b-9cf9-933c5bca0aba"). InnerVolumeSpecName "kube-api-access-znsl5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.072594 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-utilities" (OuterVolumeSpecName: "utilities") pod "ca900afa-86c0-4fe0-ba3d-d5d927db24b7" (UID: "ca900afa-86c0-4fe0-ba3d-d5d927db24b7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.072652 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-utilities" (OuterVolumeSpecName: "utilities") pod "c4fda877-ac4f-419b-9cf9-933c5bca0aba" (UID: "c4fda877-ac4f-419b-9cf9-933c5bca0aba"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.073076 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f5ef6b93-5bb5-467f-8268-5feb300e2d5c" (UID: "f5ef6b93-5bb5-467f-8268-5feb300e2d5c"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.074447 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-kube-api-access-x7wxt" (OuterVolumeSpecName: "kube-api-access-x7wxt") pod "ca900afa-86c0-4fe0-ba3d-d5d927db24b7" (UID: "ca900afa-86c0-4fe0-ba3d-d5d927db24b7"). InnerVolumeSpecName "kube-api-access-x7wxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.075230 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fecc9d18-f13e-4620-a9da-b620e9660ec7-kube-api-access-w2nrv" (OuterVolumeSpecName: "kube-api-access-w2nrv") pod "fecc9d18-f13e-4620-a9da-b620e9660ec7" (UID: "fecc9d18-f13e-4620-a9da-b620e9660ec7"). InnerVolumeSpecName "kube-api-access-w2nrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.082365 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-kube-api-access-676xh" (OuterVolumeSpecName: "kube-api-access-676xh") pod "f5ef6b93-5bb5-467f-8268-5feb300e2d5c" (UID: "f5ef6b93-5bb5-467f-8268-5feb300e2d5c"). InnerVolumeSpecName "kube-api-access-676xh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.089075 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fecc9d18-f13e-4620-a9da-b620e9660ec7" (UID: "fecc9d18-f13e-4620-a9da-b620e9660ec7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.160447 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca900afa-86c0-4fe0-ba3d-d5d927db24b7" (UID: "ca900afa-86c0-4fe0-ba3d-d5d927db24b7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169138 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c4fda877-ac4f-419b-9cf9-933c5bca0aba" (UID: "c4fda877-ac4f-419b-9cf9-933c5bca0aba"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169827 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169848 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w2nrv\" (UniqueName: \"kubernetes.io/projected/fecc9d18-f13e-4620-a9da-b620e9660ec7-kube-api-access-w2nrv\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169859 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-znsl5\" (UniqueName: \"kubernetes.io/projected/c4fda877-ac4f-419b-9cf9-933c5bca0aba-kube-api-access-znsl5\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169868 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169897 4691 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169907 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-676xh\" (UniqueName: \"kubernetes.io/projected/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-kube-api-access-676xh\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169914 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169922 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169929 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c4fda877-ac4f-419b-9cf9-933c5bca0aba-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169937 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecc9d18-f13e-4620-a9da-b620e9660ec7-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.169945 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7wxt\" (UniqueName: \"kubernetes.io/projected/ca900afa-86c0-4fe0-ba3d-d5d927db24b7-kube-api-access-x7wxt\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 
06:23:19.169952 4691 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f5ef6b93-5bb5-467f-8268-5feb300e2d5c-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.232917 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-4km9n"] Sep 30 06:23:19 crc kubenswrapper[4691]: W0930 06:23:19.236679 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf46c875b_2f18_4fac_98af_64b0756b7e26.slice/crio-a97a78a1555f4ef470659dfabac8cff7277a914aba9c8f6b02b4f0ce100e52bb WatchSource:0}: Error finding container a97a78a1555f4ef470659dfabac8cff7277a914aba9c8f6b02b4f0ce100e52bb: Status 404 returned error can't find the container with id a97a78a1555f4ef470659dfabac8cff7277a914aba9c8f6b02b4f0ce100e52bb Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.736957 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" event={"ID":"f46c875b-2f18-4fac-98af-64b0756b7e26","Type":"ContainerStarted","Data":"4b3bdd1590589ca779e71862930a647f53ed5b22d03c288c97897f623f6ac262"} Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.737454 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" event={"ID":"f46c875b-2f18-4fac-98af-64b0756b7e26","Type":"ContainerStarted","Data":"a97a78a1555f4ef470659dfabac8cff7277a914aba9c8f6b02b4f0ce100e52bb"} Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.737638 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.740178 4691 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-4km9n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.60:8080/healthz\": dial tcp 10.217.0.60:8080: connect: connection refused" start-of-body= Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.740234 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" podUID="f46c875b-2f18-4fac-98af-64b0756b7e26" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.60:8080/healthz\": dial tcp 10.217.0.60:8080: connect: connection refused" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.741491 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9mzhd" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.741496 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9mzhd" event={"ID":"fecc9d18-f13e-4620-a9da-b620e9660ec7","Type":"ContainerDied","Data":"8ada3ce5c48083d682c4e94216fd9c0972769c07127ec4fb0a99090ce7056bcc"} Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.741694 4691 scope.go:117] "RemoveContainer" containerID="e8c6fa72db0c9e7d81b84402cc9c28a73de65b554ae0c1a357a6a5d81dadae3f" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.748714 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-qln5x" event={"ID":"ca900afa-86c0-4fe0-ba3d-d5d927db24b7","Type":"ContainerDied","Data":"321515a17ac9366d6b36f7f23032258fc844948e9fef19851c1e2f6aafd683b0"} Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.748731 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-qln5x" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.752068 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.753199 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-dzf49" event={"ID":"f5ef6b93-5bb5-467f-8268-5feb300e2d5c","Type":"ContainerDied","Data":"0ecf98d90806bfece70489b8b5d7a016804ae84800a14f33622c14164ca1e62f"} Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.759608 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ghx4r" event={"ID":"c4fda877-ac4f-419b-9cf9-933c5bca0aba","Type":"ContainerDied","Data":"8e02838421031fce338c84df67c95e4c0b24c902e0f14c3bd6156639abb4f705"} Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.759755 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ghx4r" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.767241 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vzgfj" event={"ID":"b2bb2e76-e094-4320-8bab-f54bf623dcb1","Type":"ContainerDied","Data":"21a482f187a572bed61df4bb31b0f5238996fb7790b0f5875530d45fefea6315"} Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.767375 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vzgfj" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.772100 4691 scope.go:117] "RemoveContainer" containerID="d31262f193b2386c4a7f7d68a053a74f942daa91b8f439e7cc9983492cf517d5" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.778927 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" podStartSLOduration=1.778905957 podStartE2EDuration="1.778905957s" podCreationTimestamp="2025-09-30 06:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:23:19.757811187 +0000 UTC m=+243.232832257" watchObservedRunningTime="2025-09-30 06:23:19.778905957 +0000 UTC m=+243.253927027" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.785114 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mzhd"] Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.790098 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9mzhd"] Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.798916 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ghx4r"] Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.802090 4691 scope.go:117] "RemoveContainer" containerID="a03ff0b5f01a0084d1d0d69d53b6baa68da298d7a2a55c1792be8d9338c58c48" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.804187 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ghx4r"] Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.810476 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-qln5x"] Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.822162 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-qln5x"] Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.835325 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-vzgfj"] Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.837122 4691 scope.go:117] "RemoveContainer" containerID="bc8f5b68997e3a0b344d66030cdc6f2de9602f301ebb44c0dad2b4a164c943c6" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.838204 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-vzgfj"] Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.840314 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dzf49"] Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.843019 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-dzf49"] Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.852292 4691 scope.go:117] "RemoveContainer" containerID="f36f6ea592a163cc14363c8c7410a9dabebab2bb187c2d37f01d94aff5163cda" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.867408 4691 scope.go:117] "RemoveContainer" containerID="a567bb7108c3a33eb4e9d29a4a97e035ef323ab632d29711d02c18b2c482e30a" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.882820 4691 scope.go:117] "RemoveContainer" containerID="9dfc77521acadc64b053263fa9d2d2f760270714a851575b1333fd531f276b65" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.896083 4691 
scope.go:117] "RemoveContainer" containerID="be5cabb9253f82cb62fe2f55d5e5864c8a9d3b4fef3f5652b07516dbc74ebac0" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.906968 4691 scope.go:117] "RemoveContainer" containerID="325e589186d33ce364d6baa98d882e0d1befa51bbd84c16cbea1c0a74fda1a10" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.920615 4691 scope.go:117] "RemoveContainer" containerID="07b3ae1fda85d658f298e48b3517de80abf2de14675389df2e1ec1d7e416a391" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.933517 4691 scope.go:117] "RemoveContainer" containerID="425e7dd7c338e2cf530893d676a01438c6b86ddd8fbb35ebc43df3f02c5e88da" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.946859 4691 scope.go:117] "RemoveContainer" containerID="ae55550324a97f4074df6df0f895bc9770003d48668e79fd555e95318262b96c" Sep 30 06:23:19 crc kubenswrapper[4691]: I0930 06:23:19.961206 4691 scope.go:117] "RemoveContainer" containerID="05aaca92bf30378ec0da5789869bedf98ca75942d1da2d9ed4580df83fbd07a6" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.490118 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-tzs7x"] Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.491494 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerName="extract-content" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.491542 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerName="extract-content" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.491566 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.491585 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.491623 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerName="extract-utilities" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.491641 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerName="extract-utilities" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.491683 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5ef6b93-5bb5-467f-8268-5feb300e2d5c" containerName="marketplace-operator" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.491700 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5ef6b93-5bb5-467f-8268-5feb300e2d5c" containerName="marketplace-operator" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.491737 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerName="extract-utilities" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.491755 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerName="extract-utilities" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.491799 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerName="extract-content" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.491819 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" 
containerName="extract-content" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.491854 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerName="extract-content" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.492142 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerName="extract-content" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.492195 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.492212 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.492245 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerName="extract-utilities" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.492260 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerName="extract-utilities" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.492278 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerName="extract-utilities" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.492291 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerName="extract-utilities" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.492320 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerName="extract-content" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.492335 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerName="extract-content" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.492375 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.492390 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: E0930 06:23:20.492419 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.492432 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.492993 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.493063 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.493100 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecc9d18-f13e-4620-a9da-b620e9660ec7" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.493122 4691 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" containerName="registry-server" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.493147 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5ef6b93-5bb5-467f-8268-5feb300e2d5c" containerName="marketplace-operator" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.496605 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.501435 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.506949 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzs7x"] Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.587608 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02f1c945-9a65-40b7-871c-aebadb76aa48-catalog-content\") pod \"redhat-marketplace-tzs7x\" (UID: \"02f1c945-9a65-40b7-871c-aebadb76aa48\") " pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.587727 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02f1c945-9a65-40b7-871c-aebadb76aa48-utilities\") pod \"redhat-marketplace-tzs7x\" (UID: \"02f1c945-9a65-40b7-871c-aebadb76aa48\") " pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.587770 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blh22\" (UniqueName: \"kubernetes.io/projected/02f1c945-9a65-40b7-871c-aebadb76aa48-kube-api-access-blh22\") pod \"redhat-marketplace-tzs7x\" (UID: \"02f1c945-9a65-40b7-871c-aebadb76aa48\") " pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.686191 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hwbg9"] Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.688842 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02f1c945-9a65-40b7-871c-aebadb76aa48-utilities\") pod \"redhat-marketplace-tzs7x\" (UID: \"02f1c945-9a65-40b7-871c-aebadb76aa48\") " pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.688981 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blh22\" (UniqueName: \"kubernetes.io/projected/02f1c945-9a65-40b7-871c-aebadb76aa48-kube-api-access-blh22\") pod \"redhat-marketplace-tzs7x\" (UID: \"02f1c945-9a65-40b7-871c-aebadb76aa48\") " pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.689047 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02f1c945-9a65-40b7-871c-aebadb76aa48-catalog-content\") pod \"redhat-marketplace-tzs7x\" (UID: \"02f1c945-9a65-40b7-871c-aebadb76aa48\") " pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.689282 4691 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/02f1c945-9a65-40b7-871c-aebadb76aa48-utilities\") pod \"redhat-marketplace-tzs7x\" (UID: \"02f1c945-9a65-40b7-871c-aebadb76aa48\") " pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.689673 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/02f1c945-9a65-40b7-871c-aebadb76aa48-catalog-content\") pod \"redhat-marketplace-tzs7x\" (UID: \"02f1c945-9a65-40b7-871c-aebadb76aa48\") " pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.690715 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.695652 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.696176 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwbg9"] Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.713439 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blh22\" (UniqueName: \"kubernetes.io/projected/02f1c945-9a65-40b7-871c-aebadb76aa48-kube-api-access-blh22\") pod \"redhat-marketplace-tzs7x\" (UID: \"02f1c945-9a65-40b7-871c-aebadb76aa48\") " pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.783512 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-4km9n" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.790882 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-catalog-content\") pod \"certified-operators-hwbg9\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") " pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.791012 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqd7z\" (UniqueName: \"kubernetes.io/projected/38edfb7b-d43d-4492-bc1b-8281e28991c0-kube-api-access-pqd7z\") pod \"certified-operators-hwbg9\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") " pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.791120 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-utilities\") pod \"certified-operators-hwbg9\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") " pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.823960 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.892664 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-catalog-content\") pod \"certified-operators-hwbg9\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") " pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.892747 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqd7z\" (UniqueName: \"kubernetes.io/projected/38edfb7b-d43d-4492-bc1b-8281e28991c0-kube-api-access-pqd7z\") pod \"certified-operators-hwbg9\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") " pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.892827 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-utilities\") pod \"certified-operators-hwbg9\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") " pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.893468 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-catalog-content\") pod \"certified-operators-hwbg9\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") " pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.893309 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-utilities\") pod \"certified-operators-hwbg9\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") " pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:20 crc kubenswrapper[4691]: I0930 06:23:20.909805 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqd7z\" (UniqueName: \"kubernetes.io/projected/38edfb7b-d43d-4492-bc1b-8281e28991c0-kube-api-access-pqd7z\") pod \"certified-operators-hwbg9\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") " pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.050077 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:21 crc kubenswrapper[4691]: W0930 06:23:21.247443 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f1c945_9a65_40b7_871c_aebadb76aa48.slice/crio-1b24396e5c1a12cf27135f75c35f33427c48c4e1acd4b789e0c4953ab46e2a87 WatchSource:0}: Error finding container 1b24396e5c1a12cf27135f75c35f33427c48c4e1acd4b789e0c4953ab46e2a87: Status 404 returned error can't find the container with id 1b24396e5c1a12cf27135f75c35f33427c48c4e1acd4b789e0c4953ab46e2a87 Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.248622 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2bb2e76-e094-4320-8bab-f54bf623dcb1" path="/var/lib/kubelet/pods/b2bb2e76-e094-4320-8bab-f54bf623dcb1/volumes" Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.249205 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4fda877-ac4f-419b-9cf9-933c5bca0aba" path="/var/lib/kubelet/pods/c4fda877-ac4f-419b-9cf9-933c5bca0aba/volumes" Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.249874 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca900afa-86c0-4fe0-ba3d-d5d927db24b7" path="/var/lib/kubelet/pods/ca900afa-86c0-4fe0-ba3d-d5d927db24b7/volumes" Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.251031 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5ef6b93-5bb5-467f-8268-5feb300e2d5c" path="/var/lib/kubelet/pods/f5ef6b93-5bb5-467f-8268-5feb300e2d5c/volumes" Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.251522 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fecc9d18-f13e-4620-a9da-b620e9660ec7" path="/var/lib/kubelet/pods/fecc9d18-f13e-4620-a9da-b620e9660ec7/volumes" Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.257607 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-tzs7x"] Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.451980 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hwbg9"] Sep 30 06:23:21 crc kubenswrapper[4691]: W0930 06:23:21.505592 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38edfb7b_d43d_4492_bc1b_8281e28991c0.slice/crio-1f4ddc3eedb023b90e8889ae3da320a9004b185e834a0504c4f0a37b9bd7f498 WatchSource:0}: Error finding container 1f4ddc3eedb023b90e8889ae3da320a9004b185e834a0504c4f0a37b9bd7f498: Status 404 returned error can't find the container with id 1f4ddc3eedb023b90e8889ae3da320a9004b185e834a0504c4f0a37b9bd7f498 Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.787818 4691 generic.go:334] "Generic (PLEG): container finished" podID="02f1c945-9a65-40b7-871c-aebadb76aa48" containerID="f66ec8576a0c08202418c4da3792cdfbb8560c7f8a372bed3e1215a8fef12904" exitCode=0 Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.787937 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzs7x" event={"ID":"02f1c945-9a65-40b7-871c-aebadb76aa48","Type":"ContainerDied","Data":"f66ec8576a0c08202418c4da3792cdfbb8560c7f8a372bed3e1215a8fef12904"} Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.787974 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzs7x" 
event={"ID":"02f1c945-9a65-40b7-871c-aebadb76aa48","Type":"ContainerStarted","Data":"1b24396e5c1a12cf27135f75c35f33427c48c4e1acd4b789e0c4953ab46e2a87"} Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.792934 4691 generic.go:334] "Generic (PLEG): container finished" podID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerID="c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98" exitCode=0 Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.793085 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwbg9" event={"ID":"38edfb7b-d43d-4492-bc1b-8281e28991c0","Type":"ContainerDied","Data":"c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98"} Sep 30 06:23:21 crc kubenswrapper[4691]: I0930 06:23:21.793127 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwbg9" event={"ID":"38edfb7b-d43d-4492-bc1b-8281e28991c0","Type":"ContainerStarted","Data":"1f4ddc3eedb023b90e8889ae3da320a9004b185e834a0504c4f0a37b9bd7f498"} Sep 30 06:23:22 crc kubenswrapper[4691]: I0930 06:23:22.800501 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwbg9" event={"ID":"38edfb7b-d43d-4492-bc1b-8281e28991c0","Type":"ContainerStarted","Data":"1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b"} Sep 30 06:23:22 crc kubenswrapper[4691]: I0930 06:23:22.873826 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hcqm7"] Sep 30 06:23:22 crc kubenswrapper[4691]: I0930 06:23:22.874774 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:22 crc kubenswrapper[4691]: I0930 06:23:22.876239 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Sep 30 06:23:22 crc kubenswrapper[4691]: I0930 06:23:22.887832 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hcqm7"] Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.021030 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa-catalog-content\") pod \"community-operators-hcqm7\" (UID: \"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa\") " pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.021073 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa-utilities\") pod \"community-operators-hcqm7\" (UID: \"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa\") " pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.021285 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn9rv\" (UniqueName: \"kubernetes.io/projected/89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa-kube-api-access-kn9rv\") pod \"community-operators-hcqm7\" (UID: \"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa\") " pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.079851 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xbbxj"] Sep 30 06:23:23 crc 
kubenswrapper[4691]: I0930 06:23:23.081528 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.083840 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.120247 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbbxj"] Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.122988 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn9rv\" (UniqueName: \"kubernetes.io/projected/89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa-kube-api-access-kn9rv\") pod \"community-operators-hcqm7\" (UID: \"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa\") " pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.123078 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa-catalog-content\") pod \"community-operators-hcqm7\" (UID: \"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa\") " pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.123105 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa-utilities\") pod \"community-operators-hcqm7\" (UID: \"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa\") " pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.123528 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa-utilities\") pod \"community-operators-hcqm7\" (UID: \"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa\") " pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.124064 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa-catalog-content\") pod \"community-operators-hcqm7\" (UID: \"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa\") " pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.143825 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn9rv\" (UniqueName: \"kubernetes.io/projected/89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa-kube-api-access-kn9rv\") pod \"community-operators-hcqm7\" (UID: \"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa\") " pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.223998 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-catalog-content\") pod \"redhat-operators-xbbxj\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") " pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.224074 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj6zp\" (UniqueName: 
\"kubernetes.io/projected/6366c530-63f4-4f81-b0eb-b4db91578068-kube-api-access-vj6zp\") pod \"redhat-operators-xbbxj\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") " pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.224109 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-utilities\") pod \"redhat-operators-xbbxj\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") " pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.268845 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.324872 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-catalog-content\") pod \"redhat-operators-xbbxj\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") " pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.325209 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj6zp\" (UniqueName: \"kubernetes.io/projected/6366c530-63f4-4f81-b0eb-b4db91578068-kube-api-access-vj6zp\") pod \"redhat-operators-xbbxj\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") " pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.325247 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-utilities\") pod \"redhat-operators-xbbxj\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") " pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.325713 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-catalog-content\") pod \"redhat-operators-xbbxj\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") " pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.325825 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-utilities\") pod \"redhat-operators-xbbxj\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") " pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.349290 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj6zp\" (UniqueName: \"kubernetes.io/projected/6366c530-63f4-4f81-b0eb-b4db91578068-kube-api-access-vj6zp\") pod \"redhat-operators-xbbxj\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") " pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.426544 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.720809 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hcqm7"] Sep 30 06:23:23 crc kubenswrapper[4691]: W0930 06:23:23.733410 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89aa6b93_31ae_4f4f_ad2c_bed85c8fd8fa.slice/crio-fe7f6fb70a6726e5f4edf33a7fa03267f9230cd7a4f40b8f14812248d9d59e36 WatchSource:0}: Error finding container fe7f6fb70a6726e5f4edf33a7fa03267f9230cd7a4f40b8f14812248d9d59e36: Status 404 returned error can't find the container with id fe7f6fb70a6726e5f4edf33a7fa03267f9230cd7a4f40b8f14812248d9d59e36 Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.809594 4691 generic.go:334] "Generic (PLEG): container finished" podID="02f1c945-9a65-40b7-871c-aebadb76aa48" containerID="32195c2eff3b2f5fdd9265f4977c16d8207fa67b045a316b86470cd1fa7d4d31" exitCode=0 Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.809664 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzs7x" event={"ID":"02f1c945-9a65-40b7-871c-aebadb76aa48","Type":"ContainerDied","Data":"32195c2eff3b2f5fdd9265f4977c16d8207fa67b045a316b86470cd1fa7d4d31"} Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.817316 4691 generic.go:334] "Generic (PLEG): container finished" podID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerID="1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b" exitCode=0 Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.817431 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwbg9" event={"ID":"38edfb7b-d43d-4492-bc1b-8281e28991c0","Type":"ContainerDied","Data":"1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b"} Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.819227 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcqm7" event={"ID":"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa","Type":"ContainerStarted","Data":"fe7f6fb70a6726e5f4edf33a7fa03267f9230cd7a4f40b8f14812248d9d59e36"} Sep 30 06:23:23 crc kubenswrapper[4691]: I0930 06:23:23.846452 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xbbxj"] Sep 30 06:23:24 crc kubenswrapper[4691]: I0930 06:23:24.827648 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwbg9" event={"ID":"38edfb7b-d43d-4492-bc1b-8281e28991c0","Type":"ContainerStarted","Data":"b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc"} Sep 30 06:23:24 crc kubenswrapper[4691]: I0930 06:23:24.829405 4691 generic.go:334] "Generic (PLEG): container finished" podID="6366c530-63f4-4f81-b0eb-b4db91578068" containerID="a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018" exitCode=0 Sep 30 06:23:24 crc kubenswrapper[4691]: I0930 06:23:24.829461 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbbxj" event={"ID":"6366c530-63f4-4f81-b0eb-b4db91578068","Type":"ContainerDied","Data":"a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018"} Sep 30 06:23:24 crc kubenswrapper[4691]: I0930 06:23:24.829476 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbbxj" 
event={"ID":"6366c530-63f4-4f81-b0eb-b4db91578068","Type":"ContainerStarted","Data":"c3f110b5b89a9a1b6bc7b0c4dda4da315ca4be51d6a0fdf75770c6a346d39665"} Sep 30 06:23:24 crc kubenswrapper[4691]: I0930 06:23:24.832626 4691 generic.go:334] "Generic (PLEG): container finished" podID="89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa" containerID="38b2f3c48572a4a424a75026b8534f9e1506028ff2db7dd11cdd54bea45b6526" exitCode=0 Sep 30 06:23:24 crc kubenswrapper[4691]: I0930 06:23:24.833011 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcqm7" event={"ID":"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa","Type":"ContainerDied","Data":"38b2f3c48572a4a424a75026b8534f9e1506028ff2db7dd11cdd54bea45b6526"} Sep 30 06:23:24 crc kubenswrapper[4691]: I0930 06:23:24.836671 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-tzs7x" event={"ID":"02f1c945-9a65-40b7-871c-aebadb76aa48","Type":"ContainerStarted","Data":"25ef0fde0130c1f02f532f68e68a7d90590b01362197b238913c72ecfc969f39"} Sep 30 06:23:24 crc kubenswrapper[4691]: I0930 06:23:24.848584 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hwbg9" podStartSLOduration=2.3096101190000002 podStartE2EDuration="4.848566254s" podCreationTimestamp="2025-09-30 06:23:20 +0000 UTC" firstStartedPulling="2025-09-30 06:23:21.794746144 +0000 UTC m=+245.269767194" lastFinishedPulling="2025-09-30 06:23:24.333702289 +0000 UTC m=+247.808723329" observedRunningTime="2025-09-30 06:23:24.848148901 +0000 UTC m=+248.323169951" watchObservedRunningTime="2025-09-30 06:23:24.848566254 +0000 UTC m=+248.323587294" Sep 30 06:23:25 crc kubenswrapper[4691]: I0930 06:23:25.842854 4691 generic.go:334] "Generic (PLEG): container finished" podID="89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa" containerID="baea1787301fb182d471179939eac8d932676beb4368fdace89d7a46c280e07d" exitCode=0 Sep 30 06:23:25 crc kubenswrapper[4691]: I0930 06:23:25.843019 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcqm7" event={"ID":"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa","Type":"ContainerDied","Data":"baea1787301fb182d471179939eac8d932676beb4368fdace89d7a46c280e07d"} Sep 30 06:23:25 crc kubenswrapper[4691]: I0930 06:23:25.873515 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-tzs7x" podStartSLOduration=3.453824135 podStartE2EDuration="5.873492513s" podCreationTimestamp="2025-09-30 06:23:20 +0000 UTC" firstStartedPulling="2025-09-30 06:23:21.790871361 +0000 UTC m=+245.265892441" lastFinishedPulling="2025-09-30 06:23:24.210539749 +0000 UTC m=+247.685560819" observedRunningTime="2025-09-30 06:23:24.892744906 +0000 UTC m=+248.367765956" watchObservedRunningTime="2025-09-30 06:23:25.873492513 +0000 UTC m=+249.348513593" Sep 30 06:23:26 crc kubenswrapper[4691]: I0930 06:23:26.850942 4691 generic.go:334] "Generic (PLEG): container finished" podID="6366c530-63f4-4f81-b0eb-b4db91578068" containerID="7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111" exitCode=0 Sep 30 06:23:26 crc kubenswrapper[4691]: I0930 06:23:26.851047 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbbxj" event={"ID":"6366c530-63f4-4f81-b0eb-b4db91578068","Type":"ContainerDied","Data":"7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111"} Sep 30 06:23:27 crc kubenswrapper[4691]: I0930 06:23:27.859728 4691 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbbxj" event={"ID":"6366c530-63f4-4f81-b0eb-b4db91578068","Type":"ContainerStarted","Data":"10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a"} Sep 30 06:23:27 crc kubenswrapper[4691]: I0930 06:23:27.862395 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hcqm7" event={"ID":"89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa","Type":"ContainerStarted","Data":"0f562af832ccd3f36b04baee9f5c688477de9861fdfcd7c29cb6a376e9a97401"} Sep 30 06:23:27 crc kubenswrapper[4691]: I0930 06:23:27.880311 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hcqm7" podStartSLOduration=4.400090862 podStartE2EDuration="5.880285403s" podCreationTimestamp="2025-09-30 06:23:22 +0000 UTC" firstStartedPulling="2025-09-30 06:23:24.835124978 +0000 UTC m=+248.310146058" lastFinishedPulling="2025-09-30 06:23:26.315319539 +0000 UTC m=+249.790340599" observedRunningTime="2025-09-30 06:23:27.875367987 +0000 UTC m=+251.350389067" watchObservedRunningTime="2025-09-30 06:23:27.880285403 +0000 UTC m=+251.355306483" Sep 30 06:23:28 crc kubenswrapper[4691]: I0930 06:23:28.884246 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xbbxj" podStartSLOduration=3.431080384 podStartE2EDuration="5.884226665s" podCreationTimestamp="2025-09-30 06:23:23 +0000 UTC" firstStartedPulling="2025-09-30 06:23:24.830659995 +0000 UTC m=+248.305681035" lastFinishedPulling="2025-09-30 06:23:27.283806246 +0000 UTC m=+250.758827316" observedRunningTime="2025-09-30 06:23:28.88122058 +0000 UTC m=+252.356241650" watchObservedRunningTime="2025-09-30 06:23:28.884226665 +0000 UTC m=+252.359247725" Sep 30 06:23:30 crc kubenswrapper[4691]: I0930 06:23:30.824223 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:30 crc kubenswrapper[4691]: I0930 06:23:30.824558 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:30 crc kubenswrapper[4691]: I0930 06:23:30.896098 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:30 crc kubenswrapper[4691]: I0930 06:23:30.938752 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-tzs7x" Sep 30 06:23:31 crc kubenswrapper[4691]: I0930 06:23:31.050827 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:31 crc kubenswrapper[4691]: I0930 06:23:31.051093 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:31 crc kubenswrapper[4691]: I0930 06:23:31.085220 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:31 crc kubenswrapper[4691]: I0930 06:23:31.933465 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 06:23:33 crc kubenswrapper[4691]: I0930 06:23:33.269511 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hcqm7" Sep 30 
06:23:33 crc kubenswrapper[4691]: I0930 06:23:33.269544 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:33 crc kubenswrapper[4691]: I0930 06:23:33.327240 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:33 crc kubenswrapper[4691]: I0930 06:23:33.427638 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:33 crc kubenswrapper[4691]: I0930 06:23:33.427694 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:33 crc kubenswrapper[4691]: I0930 06:23:33.467825 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:23:33 crc kubenswrapper[4691]: I0930 06:23:33.945244 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hcqm7" Sep 30 06:23:33 crc kubenswrapper[4691]: I0930 06:23:33.948158 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xbbxj" Sep 30 06:25:22 crc kubenswrapper[4691]: I0930 06:25:22.849927 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:25:22 crc kubenswrapper[4691]: I0930 06:25:22.850517 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:25:52 crc kubenswrapper[4691]: I0930 06:25:52.849866 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:25:52 crc kubenswrapper[4691]: I0930 06:25:52.850777 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:26:22 crc kubenswrapper[4691]: I0930 06:26:22.850578 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:26:22 crc kubenswrapper[4691]: I0930 06:26:22.851252 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 
30 06:26:22 crc kubenswrapper[4691]: I0930 06:26:22.851319 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:26:22 crc kubenswrapper[4691]: I0930 06:26:22.852037 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5124cec3e8ade06d39c26cde1baaa625eb5e8cb0cb2eb147c1c6f02b93ecaae0"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:26:22 crc kubenswrapper[4691]: I0930 06:26:22.852124 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://5124cec3e8ade06d39c26cde1baaa625eb5e8cb0cb2eb147c1c6f02b93ecaae0" gracePeriod=600 Sep 30 06:26:23 crc kubenswrapper[4691]: I0930 06:26:23.030099 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="5124cec3e8ade06d39c26cde1baaa625eb5e8cb0cb2eb147c1c6f02b93ecaae0" exitCode=0 Sep 30 06:26:23 crc kubenswrapper[4691]: I0930 06:26:23.030137 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"5124cec3e8ade06d39c26cde1baaa625eb5e8cb0cb2eb147c1c6f02b93ecaae0"} Sep 30 06:26:23 crc kubenswrapper[4691]: I0930 06:26:23.030408 4691 scope.go:117] "RemoveContainer" containerID="1440cd9bacd9b28d5e192b3d9143dda931c741606fdb8f246fba23b8a68c534c" Sep 30 06:26:24 crc kubenswrapper[4691]: I0930 06:26:24.040232 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"31e757fc7bb8d72540655d2ce1c4ea6d10d3a5eb3fd6ea0108f524dba7e5bca2"} Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.563641 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bnnkm"] Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.565093 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.578691 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bnnkm"] Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.753539 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.753608 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-bound-sa-token\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.753640 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-registry-tls\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.753850 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-registry-certificates\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.753958 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-trusted-ca\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.754044 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.754122 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.754235 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f4gq\" (UniqueName: 
\"kubernetes.io/projected/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-kube-api-access-4f4gq\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.788129 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.855229 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4f4gq\" (UniqueName: \"kubernetes.io/projected/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-kube-api-access-4f4gq\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.855293 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.855333 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-bound-sa-token\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.855368 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-registry-tls\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.855740 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-registry-certificates\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.855792 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-trusted-ca\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.855822 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.856603 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.858252 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-registry-certificates\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.858678 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-trusted-ca\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.865020 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-registry-tls\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.865546 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.886475 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f4gq\" (UniqueName: \"kubernetes.io/projected/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-kube-api-access-4f4gq\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:12 crc kubenswrapper[4691]: I0930 06:27:12.886556 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5c79a7e9-bfcf-4af7-b733-c9565542c0a7-bound-sa-token\") pod \"image-registry-66df7c8f76-bnnkm\" (UID: \"5c79a7e9-bfcf-4af7-b733-c9565542c0a7\") " pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:13 crc kubenswrapper[4691]: I0930 06:27:13.182073 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:13 crc kubenswrapper[4691]: I0930 06:27:13.409069 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-bnnkm"] Sep 30 06:27:14 crc kubenswrapper[4691]: I0930 06:27:14.414636 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" event={"ID":"5c79a7e9-bfcf-4af7-b733-c9565542c0a7","Type":"ContainerStarted","Data":"e905957b660e40b748476dd5701817ff2ab862c08f3a289eccdbbf3d688c609c"} Sep 30 06:27:14 crc kubenswrapper[4691]: I0930 06:27:14.415070 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" event={"ID":"5c79a7e9-bfcf-4af7-b733-c9565542c0a7","Type":"ContainerStarted","Data":"d68c8911c857ca4fd37c14a8dabc0cd30c055ed8b3ab3246f7324599da9690c8"} Sep 30 06:27:14 crc kubenswrapper[4691]: I0930 06:27:14.415108 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:14 crc kubenswrapper[4691]: I0930 06:27:14.446290 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" podStartSLOduration=2.446266267 podStartE2EDuration="2.446266267s" podCreationTimestamp="2025-09-30 06:27:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:27:14.442854892 +0000 UTC m=+477.917875992" watchObservedRunningTime="2025-09-30 06:27:14.446266267 +0000 UTC m=+477.921287337" Sep 30 06:27:33 crc kubenswrapper[4691]: I0930 06:27:33.188470 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-bnnkm" Sep 30 06:27:33 crc kubenswrapper[4691]: I0930 06:27:33.254951 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlxgb"] Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.313622 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" podUID="c134a0fe-e3a2-4683-95d1-045ba2056b14" containerName="registry" containerID="cri-o://5a1a61bce140a4733d73582e090cece0039c90f5edb54b09d245529de9f3a1ee" gracePeriod=30 Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.719870 4691 generic.go:334] "Generic (PLEG): container finished" podID="c134a0fe-e3a2-4683-95d1-045ba2056b14" containerID="5a1a61bce140a4733d73582e090cece0039c90f5edb54b09d245529de9f3a1ee" exitCode=0 Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.719945 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" event={"ID":"c134a0fe-e3a2-4683-95d1-045ba2056b14","Type":"ContainerDied","Data":"5a1a61bce140a4733d73582e090cece0039c90f5edb54b09d245529de9f3a1ee"} Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.720336 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" event={"ID":"c134a0fe-e3a2-4683-95d1-045ba2056b14","Type":"ContainerDied","Data":"4cb80b00ced64fb3f7ec90b38a3d3186ed66f695137b7bad6a0f749f9f6427f2"} Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.720365 4691 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="4cb80b00ced64fb3f7ec90b38a3d3186ed66f695137b7bad6a0f749f9f6427f2" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.755149 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.854076 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c134a0fe-e3a2-4683-95d1-045ba2056b14-installation-pull-secrets\") pod \"c134a0fe-e3a2-4683-95d1-045ba2056b14\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.854147 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-certificates\") pod \"c134a0fe-e3a2-4683-95d1-045ba2056b14\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.854242 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-bound-sa-token\") pod \"c134a0fe-e3a2-4683-95d1-045ba2056b14\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.854312 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bfp66\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-kube-api-access-bfp66\") pod \"c134a0fe-e3a2-4683-95d1-045ba2056b14\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.854377 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-tls\") pod \"c134a0fe-e3a2-4683-95d1-045ba2056b14\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.854413 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c134a0fe-e3a2-4683-95d1-045ba2056b14-ca-trust-extracted\") pod \"c134a0fe-e3a2-4683-95d1-045ba2056b14\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.854466 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-trusted-ca\") pod \"c134a0fe-e3a2-4683-95d1-045ba2056b14\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.854692 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"c134a0fe-e3a2-4683-95d1-045ba2056b14\" (UID: \"c134a0fe-e3a2-4683-95d1-045ba2056b14\") " Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.856007 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c134a0fe-e3a2-4683-95d1-045ba2056b14" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.856270 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c134a0fe-e3a2-4683-95d1-045ba2056b14" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.866136 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c134a0fe-e3a2-4683-95d1-045ba2056b14" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.868220 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c134a0fe-e3a2-4683-95d1-045ba2056b14-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c134a0fe-e3a2-4683-95d1-045ba2056b14" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.870085 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "c134a0fe-e3a2-4683-95d1-045ba2056b14" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.871282 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-kube-api-access-bfp66" (OuterVolumeSpecName: "kube-api-access-bfp66") pod "c134a0fe-e3a2-4683-95d1-045ba2056b14" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14"). InnerVolumeSpecName "kube-api-access-bfp66". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.871681 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c134a0fe-e3a2-4683-95d1-045ba2056b14" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.889332 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c134a0fe-e3a2-4683-95d1-045ba2056b14-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c134a0fe-e3a2-4683-95d1-045ba2056b14" (UID: "c134a0fe-e3a2-4683-95d1-045ba2056b14"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.956318 4691 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-certificates\") on node \"crc\" DevicePath \"\"" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.956365 4691 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-bound-sa-token\") on node \"crc\" DevicePath \"\"" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.956383 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bfp66\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-kube-api-access-bfp66\") on node \"crc\" DevicePath \"\"" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.956402 4691 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c134a0fe-e3a2-4683-95d1-045ba2056b14-registry-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.956418 4691 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c134a0fe-e3a2-4683-95d1-045ba2056b14-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.956435 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c134a0fe-e3a2-4683-95d1-045ba2056b14-trusted-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:27:58 crc kubenswrapper[4691]: I0930 06:27:58.956452 4691 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c134a0fe-e3a2-4683-95d1-045ba2056b14-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Sep 30 06:27:59 crc kubenswrapper[4691]: I0930 06:27:59.727023 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xlxgb" Sep 30 06:27:59 crc kubenswrapper[4691]: I0930 06:27:59.751838 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlxgb"] Sep 30 06:27:59 crc kubenswrapper[4691]: I0930 06:27:59.757586 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xlxgb"] Sep 30 06:28:01 crc kubenswrapper[4691]: I0930 06:28:01.236743 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c134a0fe-e3a2-4683-95d1-045ba2056b14" path="/var/lib/kubelet/pods/c134a0fe-e3a2-4683-95d1-045ba2056b14/volumes" Sep 30 06:28:17 crc kubenswrapper[4691]: I0930 06:28:17.477633 4691 scope.go:117] "RemoveContainer" containerID="5a1a61bce140a4733d73582e090cece0039c90f5edb54b09d245529de9f3a1ee" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.587738 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xwxbt"] Sep 30 06:28:22 crc kubenswrapper[4691]: E0930 06:28:22.588518 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c134a0fe-e3a2-4683-95d1-045ba2056b14" containerName="registry" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.588532 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c134a0fe-e3a2-4683-95d1-045ba2056b14" containerName="registry" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.588633 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c134a0fe-e3a2-4683-95d1-045ba2056b14" containerName="registry" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.589043 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-xwxbt" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.592882 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.593285 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnz9\" (UniqueName: \"kubernetes.io/projected/5746a924-b059-4e93-91c3-31bbe5e2ef86-kube-api-access-qfnz9\") pod \"cert-manager-cainjector-7f985d654d-xwxbt\" (UID: \"5746a924-b059-4e93-91c3-31bbe5e2ef86\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xwxbt" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.595562 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xxwh7"] Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.596683 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xxwh7" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.600990 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xwxbt"] Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.601385 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.603180 4691 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-lnk4w" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.603666 4691 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-h2gdg" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.613696 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-99v8p"] Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.614535 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-99v8p" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.617558 4691 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-bl7hn" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.632036 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xxwh7"] Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.635199 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-99v8p"] Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.695091 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vr7fn\" (UniqueName: \"kubernetes.io/projected/746ab0d5-3b5c-4985-935e-73a35939302d-kube-api-access-vr7fn\") pod \"cert-manager-webhook-5655c58dd6-99v8p\" (UID: \"746ab0d5-3b5c-4985-935e-73a35939302d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-99v8p" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.695178 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnz9\" (UniqueName: \"kubernetes.io/projected/5746a924-b059-4e93-91c3-31bbe5e2ef86-kube-api-access-qfnz9\") pod \"cert-manager-cainjector-7f985d654d-xwxbt\" (UID: \"5746a924-b059-4e93-91c3-31bbe5e2ef86\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xwxbt" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.718465 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnz9\" (UniqueName: \"kubernetes.io/projected/5746a924-b059-4e93-91c3-31bbe5e2ef86-kube-api-access-qfnz9\") pod \"cert-manager-cainjector-7f985d654d-xwxbt\" (UID: \"5746a924-b059-4e93-91c3-31bbe5e2ef86\") " pod="cert-manager/cert-manager-cainjector-7f985d654d-xwxbt" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.796252 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2skjz\" (UniqueName: \"kubernetes.io/projected/2ee3b0f5-be03-426e-b603-ec6c53237e85-kube-api-access-2skjz\") pod \"cert-manager-5b446d88c5-xxwh7\" (UID: \"2ee3b0f5-be03-426e-b603-ec6c53237e85\") " pod="cert-manager/cert-manager-5b446d88c5-xxwh7" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.796319 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-vr7fn\" (UniqueName: \"kubernetes.io/projected/746ab0d5-3b5c-4985-935e-73a35939302d-kube-api-access-vr7fn\") pod \"cert-manager-webhook-5655c58dd6-99v8p\" (UID: \"746ab0d5-3b5c-4985-935e-73a35939302d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-99v8p" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.822131 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vr7fn\" (UniqueName: \"kubernetes.io/projected/746ab0d5-3b5c-4985-935e-73a35939302d-kube-api-access-vr7fn\") pod \"cert-manager-webhook-5655c58dd6-99v8p\" (UID: \"746ab0d5-3b5c-4985-935e-73a35939302d\") " pod="cert-manager/cert-manager-webhook-5655c58dd6-99v8p" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.898002 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2skjz\" (UniqueName: \"kubernetes.io/projected/2ee3b0f5-be03-426e-b603-ec6c53237e85-kube-api-access-2skjz\") pod \"cert-manager-5b446d88c5-xxwh7\" (UID: \"2ee3b0f5-be03-426e-b603-ec6c53237e85\") " pod="cert-manager/cert-manager-5b446d88c5-xxwh7" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.910003 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-7f985d654d-xwxbt" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.923367 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2skjz\" (UniqueName: \"kubernetes.io/projected/2ee3b0f5-be03-426e-b603-ec6c53237e85-kube-api-access-2skjz\") pod \"cert-manager-5b446d88c5-xxwh7\" (UID: \"2ee3b0f5-be03-426e-b603-ec6c53237e85\") " pod="cert-manager/cert-manager-5b446d88c5-xxwh7" Sep 30 06:28:22 crc kubenswrapper[4691]: I0930 06:28:22.931183 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-5655c58dd6-99v8p" Sep 30 06:28:23 crc kubenswrapper[4691]: I0930 06:28:23.137945 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-5655c58dd6-99v8p"] Sep 30 06:28:23 crc kubenswrapper[4691]: I0930 06:28:23.150588 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:28:23 crc kubenswrapper[4691]: I0930 06:28:23.172394 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-7f985d654d-xwxbt"] Sep 30 06:28:23 crc kubenswrapper[4691]: W0930 06:28:23.174718 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5746a924_b059_4e93_91c3_31bbe5e2ef86.slice/crio-f7f3df00a1209fd2e0a48d11f5b8ba50ff1eb0c587425f0730bf7c694c27ffdb WatchSource:0}: Error finding container f7f3df00a1209fd2e0a48d11f5b8ba50ff1eb0c587425f0730bf7c694c27ffdb: Status 404 returned error can't find the container with id f7f3df00a1209fd2e0a48d11f5b8ba50ff1eb0c587425f0730bf7c694c27ffdb Sep 30 06:28:23 crc kubenswrapper[4691]: I0930 06:28:23.218560 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-5b446d88c5-xxwh7" Sep 30 06:28:23 crc kubenswrapper[4691]: I0930 06:28:23.423650 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-5b446d88c5-xxwh7"] Sep 30 06:28:23 crc kubenswrapper[4691]: W0930 06:28:23.431173 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ee3b0f5_be03_426e_b603_ec6c53237e85.slice/crio-54ddacf674adbcb185b21decec31c09ad81462b1dc0b3d304d568f1f88520d5d WatchSource:0}: Error finding container 54ddacf674adbcb185b21decec31c09ad81462b1dc0b3d304d568f1f88520d5d: Status 404 returned error can't find the container with id 54ddacf674adbcb185b21decec31c09ad81462b1dc0b3d304d568f1f88520d5d Sep 30 06:28:23 crc kubenswrapper[4691]: I0930 06:28:23.905406 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-99v8p" event={"ID":"746ab0d5-3b5c-4985-935e-73a35939302d","Type":"ContainerStarted","Data":"5ce368e43d0455a77cbd2e75d340edbb119860e0a93108d3f21c3390cca4fa1d"} Sep 30 06:28:23 crc kubenswrapper[4691]: I0930 06:28:23.907092 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xxwh7" event={"ID":"2ee3b0f5-be03-426e-b603-ec6c53237e85","Type":"ContainerStarted","Data":"54ddacf674adbcb185b21decec31c09ad81462b1dc0b3d304d568f1f88520d5d"} Sep 30 06:28:23 crc kubenswrapper[4691]: I0930 06:28:23.908634 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-xwxbt" event={"ID":"5746a924-b059-4e93-91c3-31bbe5e2ef86","Type":"ContainerStarted","Data":"f7f3df00a1209fd2e0a48d11f5b8ba50ff1eb0c587425f0730bf7c694c27ffdb"} Sep 30 06:28:25 crc kubenswrapper[4691]: I0930 06:28:25.926521 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-7f985d654d-xwxbt" event={"ID":"5746a924-b059-4e93-91c3-31bbe5e2ef86","Type":"ContainerStarted","Data":"51ce23eeb7b00bcd5a45433d4a4f9e2368722839fdc7dae508d79247a8fe9d47"} Sep 30 06:28:26 crc kubenswrapper[4691]: I0930 06:28:26.932150 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-5655c58dd6-99v8p" event={"ID":"746ab0d5-3b5c-4985-935e-73a35939302d","Type":"ContainerStarted","Data":"367f01886b615b26263b134ebecc9c20449fab035977ef2be991fa752f547c2b"} Sep 30 06:28:26 crc kubenswrapper[4691]: I0930 06:28:26.947517 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-7f985d654d-xwxbt" podStartSLOduration=2.363034747 podStartE2EDuration="4.947502123s" podCreationTimestamp="2025-09-30 06:28:22 +0000 UTC" firstStartedPulling="2025-09-30 06:28:23.176247187 +0000 UTC m=+546.651268227" lastFinishedPulling="2025-09-30 06:28:25.760714543 +0000 UTC m=+549.235735603" observedRunningTime="2025-09-30 06:28:25.954210079 +0000 UTC m=+549.429231149" watchObservedRunningTime="2025-09-30 06:28:26.947502123 +0000 UTC m=+550.422523163" Sep 30 06:28:27 crc kubenswrapper[4691]: I0930 06:28:27.264188 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-5655c58dd6-99v8p" podStartSLOduration=2.531056727 podStartE2EDuration="5.264159608s" podCreationTimestamp="2025-09-30 06:28:22 +0000 UTC" firstStartedPulling="2025-09-30 06:28:23.150365108 +0000 UTC m=+546.625386148" lastFinishedPulling="2025-09-30 06:28:25.883467949 +0000 UTC m=+549.358489029" 
observedRunningTime="2025-09-30 06:28:26.948533575 +0000 UTC m=+550.423554615" watchObservedRunningTime="2025-09-30 06:28:27.264159608 +0000 UTC m=+550.739180688" Sep 30 06:28:27 crc kubenswrapper[4691]: I0930 06:28:27.932088 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-5655c58dd6-99v8p" Sep 30 06:28:27 crc kubenswrapper[4691]: I0930 06:28:27.939297 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-5b446d88c5-xxwh7" event={"ID":"2ee3b0f5-be03-426e-b603-ec6c53237e85","Type":"ContainerStarted","Data":"9e5bba5afd36484ae515b85218f784b09f4555ef44286fcef9440b423ee2f637"} Sep 30 06:28:27 crc kubenswrapper[4691]: I0930 06:28:27.965441 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-5b446d88c5-xxwh7" podStartSLOduration=2.197188157 podStartE2EDuration="5.965418567s" podCreationTimestamp="2025-09-30 06:28:22 +0000 UTC" firstStartedPulling="2025-09-30 06:28:23.434145542 +0000 UTC m=+546.909166622" lastFinishedPulling="2025-09-30 06:28:27.202375992 +0000 UTC m=+550.677397032" observedRunningTime="2025-09-30 06:28:27.960267484 +0000 UTC m=+551.435288594" watchObservedRunningTime="2025-09-30 06:28:27.965418567 +0000 UTC m=+551.440439647" Sep 30 06:28:32 crc kubenswrapper[4691]: I0930 06:28:32.936022 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-5655c58dd6-99v8p" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.405848 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sjmvw"] Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.406712 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovn-controller" containerID="cri-o://d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72" gracePeriod=30 Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.406846 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="sbdb" containerID="cri-o://c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e" gracePeriod=30 Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.406952 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="kube-rbac-proxy-node" containerID="cri-o://5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05" gracePeriod=30 Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.407000 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovn-acl-logging" containerID="cri-o://20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be" gracePeriod=30 Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.407154 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="nbdb" containerID="cri-o://36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b" gracePeriod=30 Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.407247 4691 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="northd" containerID="cri-o://f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023" gracePeriod=30 Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.409017 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394" gracePeriod=30 Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.467568 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" containerID="cri-o://0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8" gracePeriod=30 Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.764455 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/3.log" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.767069 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovn-acl-logging/0.log" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.767637 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovn-controller/0.log" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.768136 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833341 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-7scx6"] Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833546 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833561 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833573 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovn-acl-logging" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833580 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovn-acl-logging" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833594 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovn-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833601 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovn-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833612 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="northd" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833619 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="northd" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833627 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833634 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833642 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833648 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833657 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833664 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833673 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="sbdb" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833680 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="sbdb" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833692 4691 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="kubecfg-setup" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833700 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="kubecfg-setup" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833710 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="nbdb" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833717 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="nbdb" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.833729 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="kube-rbac-proxy-node" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.833739 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="kube-rbac-proxy-node" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.834094 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834103 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834216 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834228 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834237 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="northd" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834246 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="kube-rbac-proxy-ovn-metrics" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834258 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovn-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834266 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834276 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovn-acl-logging" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834286 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="kube-rbac-proxy-node" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834301 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="sbdb" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834308 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="nbdb" Sep 30 06:28:33 crc kubenswrapper[4691]: E0930 06:28:33.834413 4691 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834422 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834526 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.834539 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerName="ovnkube-controller" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.836555 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947022 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-netns\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947075 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-log-socket\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947113 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-slash\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947139 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947150 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-script-lib\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947204 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-ovn\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947233 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947258 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-log-socket" (OuterVolumeSpecName: "log-socket") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947280 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-slash" (OuterVolumeSpecName: "host-slash") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947342 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-etc-openvswitch\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947442 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947519 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-node-log\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947552 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-node-log" (OuterVolumeSpecName: "node-log") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947814 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.947846 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nvgw\" (UniqueName: \"kubernetes.io/projected/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-kube-api-access-5nvgw\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948013 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-systemd\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948073 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-kubelet\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948132 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-env-overrides\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948206 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948215 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948267 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-openvswitch\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948318 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-config\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948334 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948362 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948382 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-var-lib-openvswitch\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948425 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-systemd-units\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948467 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-ovn-kubernetes\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948513 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948523 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovn-node-metrics-cert\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948553 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948567 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-netd\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948640 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-bin\") pod \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\" (UID: \"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d\") " Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948573 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948605 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948581 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948792 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948801 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.948994 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-cni-netd\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949056 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-ovn-node-metrics-cert\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949125 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-run-ovn\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949175 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949277 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-ovnkube-config\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949360 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-run-systemd\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949411 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-slash\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949485 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-env-overrides\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949574 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-systemd-units\") pod 
\"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949666 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-node-log\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949720 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-run-netns\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949774 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-var-lib-openvswitch\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949829 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-cni-bin\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.949943 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-run-openvswitch\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950018 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950112 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnt2l\" (UniqueName: \"kubernetes.io/projected/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-kube-api-access-rnt2l\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950146 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-log-socket\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950165 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-kubelet\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950183 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-etc-openvswitch\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950204 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-ovnkube-script-lib\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950297 4691 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950315 4691 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-node-log\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950327 4691 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-kubelet\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950338 4691 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-env-overrides\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950377 4691 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950388 4691 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950398 4691 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950408 4691 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950417 4691 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-systemd-units\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc 
kubenswrapper[4691]: I0930 06:28:33.950426 4691 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950437 4691 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-netd\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950446 4691 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-cni-bin\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950455 4691 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-run-netns\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950464 4691 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-log-socket\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950472 4691 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-host-slash\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950482 4691 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.950491 4691 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.956600 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.958798 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-kube-api-access-5nvgw" (OuterVolumeSpecName: "kube-api-access-5nvgw") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "kube-api-access-5nvgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.974220 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" (UID: "6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:28:33 crc kubenswrapper[4691]: I0930 06:28:33.998164 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovnkube-controller/3.log" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.001069 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovn-acl-logging/0.log" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.001827 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-sjmvw_6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d/ovn-controller/0.log" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002209 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8" exitCode=0 Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002234 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e" exitCode=0 Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002244 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b" exitCode=0 Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002252 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023" exitCode=0 Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002261 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394" exitCode=0 Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002269 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05" exitCode=0 Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002279 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be" exitCode=143 Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002288 4691 generic.go:334] "Generic (PLEG): container finished" podID="6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d" containerID="d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72" exitCode=143 Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002338 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002369 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002383 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002395 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002407 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002418 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002433 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002444 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002451 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002458 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002465 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002472 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002479 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002487 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002494 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002503 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002514 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002524 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002542 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002553 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002572 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002620 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002631 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002640 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002649 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002659 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002676 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002696 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002706 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 
06:28:34.002714 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002720 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002728 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002737 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002745 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002751 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002758 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002764 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002774 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" event={"ID":"6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d","Type":"ContainerDied","Data":"0ab6d01b4928b3b2967394cb35e4c99eb8f282ea2c9f82bfaba96756581f6b1b"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002785 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002793 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002800 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002807 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002814 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 
06:28:34.002821 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002827 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002834 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002840 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002847 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.002477 4691 scope.go:117] "RemoveContainer" containerID="0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.004059 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-sjmvw" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.005091 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/2.log" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.005602 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/1.log" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.005647 4691 generic.go:334] "Generic (PLEG): container finished" podID="5bfd073c-4582-4a65-8170-7030f4852174" containerID="6dc9f6de9a72745abb5fcd3a1cc65a6aade6d9c7dc8696106fc1a98b3550d079" exitCode=2 Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.005675 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjjw8" event={"ID":"5bfd073c-4582-4a65-8170-7030f4852174","Type":"ContainerDied","Data":"6dc9f6de9a72745abb5fcd3a1cc65a6aade6d9c7dc8696106fc1a98b3550d079"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.005715 4691 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d"} Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.006259 4691 scope.go:117] "RemoveContainer" containerID="6dc9f6de9a72745abb5fcd3a1cc65a6aade6d9c7dc8696106fc1a98b3550d079" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.006451 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xjjw8_openshift-multus(5bfd073c-4582-4a65-8170-7030f4852174)\"" pod="openshift-multus/multus-xjjw8" podUID="5bfd073c-4582-4a65-8170-7030f4852174" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.046907 4691 scope.go:117] "RemoveContainer" 
containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.053518 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-run-ovn\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.053776 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-run-ovn\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.053856 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.053931 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-ovnkube-config\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.053964 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-run-systemd\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.053991 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-slash\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.054034 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-env-overrides\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.054077 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-systemd-units\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.054106 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-node-log\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.054754 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-run-netns\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.054797 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-var-lib-openvswitch\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.054856 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-cni-bin\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.054922 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-run-openvswitch\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.054967 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055003 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnt2l\" (UniqueName: \"kubernetes.io/projected/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-kube-api-access-rnt2l\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055040 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-log-socket\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055069 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-kubelet\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055099 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-etc-openvswitch\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055130 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-ovnkube-script-lib\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055197 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-ovn-node-metrics-cert\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055227 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-cni-netd\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055310 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-cni-netd\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055335 4691 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-run-systemd\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-run-ovn-kubernetes\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055570 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-slash\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055586 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-log-socket\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055636 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-run-openvswitch\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055693 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-7scx6\" (UID: 
\"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.055970 4691 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.056025 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-cni-bin\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.056024 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-env-overrides\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.056053 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-var-lib-openvswitch\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.056080 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-run-systemd\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.056113 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-node-log\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.056142 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-systemd-units\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.056168 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-run-netns\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.056205 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-host-kubelet\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.056234 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-etc-openvswitch\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.056260 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nvgw\" (UniqueName: \"kubernetes.io/projected/6f1b023d-cbb5-4ddf-a9d0-274d2fc70c1d-kube-api-access-5nvgw\") on node \"crc\" DevicePath \"\"" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.057446 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-ovnkube-script-lib\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.057710 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-ovnkube-config\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.061632 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sjmvw"] Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.065218 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-ovn-node-metrics-cert\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.067997 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-sjmvw"] Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.078318 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnt2l\" (UniqueName: \"kubernetes.io/projected/91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c-kube-api-access-rnt2l\") pod \"ovnkube-node-7scx6\" (UID: \"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c\") " pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.079008 4691 scope.go:117] "RemoveContainer" containerID="c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.091719 4691 scope.go:117] "RemoveContainer" containerID="36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.105212 4691 scope.go:117] "RemoveContainer" containerID="f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.128817 4691 scope.go:117] "RemoveContainer" containerID="04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.145474 4691 scope.go:117] "RemoveContainer" containerID="5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.149876 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.170848 4691 scope.go:117] "RemoveContainer" containerID="20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.190670 4691 scope.go:117] "RemoveContainer" containerID="d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.213168 4691 scope.go:117] "RemoveContainer" containerID="32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.232652 4691 scope.go:117] "RemoveContainer" containerID="0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.233233 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": container with ID starting with 0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8 not found: ID does not exist" containerID="0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.233270 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"} err="failed to get container status \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": rpc error: code = NotFound desc = could not find container \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": container with ID starting with 0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8 not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.233296 4691 scope.go:117] "RemoveContainer" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.233662 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\": container with ID starting with 76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e not found: ID does not exist" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.233731 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"} err="failed to get container status \"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\": rpc error: code = NotFound desc = could not find container \"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\": container with ID starting with 76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.233774 4691 scope.go:117] "RemoveContainer" containerID="c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.234163 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\": container with ID starting 
with c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e not found: ID does not exist" containerID="c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.234209 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"} err="failed to get container status \"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\": rpc error: code = NotFound desc = could not find container \"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\": container with ID starting with c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.234242 4691 scope.go:117] "RemoveContainer" containerID="36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.234518 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\": container with ID starting with 36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b not found: ID does not exist" containerID="36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.234592 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"} err="failed to get container status \"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\": rpc error: code = NotFound desc = could not find container \"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\": container with ID starting with 36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.234612 4691 scope.go:117] "RemoveContainer" containerID="f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.235335 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\": container with ID starting with f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023 not found: ID does not exist" containerID="f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.235388 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"} err="failed to get container status \"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\": rpc error: code = NotFound desc = could not find container \"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\": container with ID starting with f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023 not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.235416 4691 scope.go:117] "RemoveContainer" containerID="04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.235811 4691 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\": container with ID starting with 04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394 not found: ID does not exist" containerID="04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.235862 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"} err="failed to get container status \"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\": rpc error: code = NotFound desc = could not find container \"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\": container with ID starting with 04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394 not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.235924 4691 scope.go:117] "RemoveContainer" containerID="5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.236500 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\": container with ID starting with 5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05 not found: ID does not exist" containerID="5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.236568 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"} err="failed to get container status \"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\": rpc error: code = NotFound desc = could not find container \"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\": container with ID starting with 5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05 not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.236609 4691 scope.go:117] "RemoveContainer" containerID="20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.237101 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\": container with ID starting with 20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be not found: ID does not exist" containerID="20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.237158 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"} err="failed to get container status \"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\": rpc error: code = NotFound desc = could not find container \"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\": container with ID starting with 20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.237199 4691 scope.go:117] "RemoveContainer" 
containerID="d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.237759 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\": container with ID starting with d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72 not found: ID does not exist" containerID="d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.237794 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"} err="failed to get container status \"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\": rpc error: code = NotFound desc = could not find container \"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\": container with ID starting with d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72 not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.237815 4691 scope.go:117] "RemoveContainer" containerID="32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72" Sep 30 06:28:34 crc kubenswrapper[4691]: E0930 06:28:34.238185 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\": container with ID starting with 32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72 not found: ID does not exist" containerID="32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.238251 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"} err="failed to get container status \"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\": rpc error: code = NotFound desc = could not find container \"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\": container with ID starting with 32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72 not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.238291 4691 scope.go:117] "RemoveContainer" containerID="0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.238746 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"} err="failed to get container status \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": rpc error: code = NotFound desc = could not find container \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": container with ID starting with 0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8 not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.238817 4691 scope.go:117] "RemoveContainer" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.239354 4691 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"} err="failed to get container status \"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\": rpc error: code = NotFound desc = could not find container \"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\": container with ID starting with 76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.239412 4691 scope.go:117] "RemoveContainer" containerID="c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.239783 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"} err="failed to get container status \"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\": rpc error: code = NotFound desc = could not find container \"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\": container with ID starting with c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.239827 4691 scope.go:117] "RemoveContainer" containerID="36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.240421 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"} err="failed to get container status \"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\": rpc error: code = NotFound desc = could not find container \"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\": container with ID starting with 36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.240458 4691 scope.go:117] "RemoveContainer" containerID="f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.240844 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"} err="failed to get container status \"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\": rpc error: code = NotFound desc = could not find container \"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\": container with ID starting with f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023 not found: ID does not exist" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.240931 4691 scope.go:117] "RemoveContainer" containerID="04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394" Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.241413 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"} err="failed to get container status \"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\": rpc error: code = NotFound desc = could not find container \"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\": container with ID starting with 04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394 not found: ID does not exist" Sep 
30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.241441 4691 scope.go:117] "RemoveContainer" containerID="5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.241694 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"} err="failed to get container status \"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\": rpc error: code = NotFound desc = could not find container \"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\": container with ID starting with 5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.241719 4691 scope.go:117] "RemoveContainer" containerID="20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.242089 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"} err="failed to get container status \"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\": rpc error: code = NotFound desc = could not find container \"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\": container with ID starting with 20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.242147 4691 scope.go:117] "RemoveContainer" containerID="d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.242689 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"} err="failed to get container status \"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\": rpc error: code = NotFound desc = could not find container \"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\": container with ID starting with d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.242715 4691 scope.go:117] "RemoveContainer" containerID="32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.243076 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"} err="failed to get container status \"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\": rpc error: code = NotFound desc = could not find container \"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\": container with ID starting with 32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.243137 4691 scope.go:117] "RemoveContainer" containerID="0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.243650 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"} err="failed to get container status \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": rpc error: code = NotFound desc = could not find container \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": container with ID starting with 0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.243715 4691 scope.go:117] "RemoveContainer" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.244157 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"} err="failed to get container status \"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\": rpc error: code = NotFound desc = could not find container \"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\": container with ID starting with 76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.244263 4691 scope.go:117] "RemoveContainer" containerID="c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.244653 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"} err="failed to get container status \"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\": rpc error: code = NotFound desc = could not find container \"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\": container with ID starting with c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.244681 4691 scope.go:117] "RemoveContainer" containerID="36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.245240 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"} err="failed to get container status \"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\": rpc error: code = NotFound desc = could not find container \"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\": container with ID starting with 36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.245278 4691 scope.go:117] "RemoveContainer" containerID="f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.246216 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"} err="failed to get container status \"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\": rpc error: code = NotFound desc = could not find container \"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\": container with ID starting with f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.246250 4691 scope.go:117] "RemoveContainer" containerID="04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.246582 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"} err="failed to get container status \"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\": rpc error: code = NotFound desc = could not find container \"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\": container with ID starting with 04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.246644 4691 scope.go:117] "RemoveContainer" containerID="5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.246941 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"} err="failed to get container status \"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\": rpc error: code = NotFound desc = could not find container \"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\": container with ID starting with 5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.246977 4691 scope.go:117] "RemoveContainer" containerID="20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.247500 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"} err="failed to get container status \"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\": rpc error: code = NotFound desc = could not find container \"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\": container with ID starting with 20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.247552 4691 scope.go:117] "RemoveContainer" containerID="d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.247904 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"} err="failed to get container status \"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\": rpc error: code = NotFound desc = could not find container \"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\": container with ID starting with d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.247927 4691 scope.go:117] "RemoveContainer" containerID="32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.248591 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"} err="failed to get container status \"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\": rpc error: code = NotFound desc = could not find container \"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\": container with ID starting with 32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.248618 4691 scope.go:117] "RemoveContainer" containerID="0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.248938 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"} err="failed to get container status \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": rpc error: code = NotFound desc = could not find container \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": container with ID starting with 0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.248996 4691 scope.go:117] "RemoveContainer" containerID="76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.249442 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e"} err="failed to get container status \"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\": rpc error: code = NotFound desc = could not find container \"76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e\": container with ID starting with 76736edc05ab57e4ee3d6dea3b7f4d2033e718cdac579abe7c4f41fe6988f62e not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.249468 4691 scope.go:117] "RemoveContainer" containerID="c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.249838 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e"} err="failed to get container status \"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\": rpc error: code = NotFound desc = could not find container \"c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e\": container with ID starting with c9f49abce591ae5c912fdbbdf0253d06ee20088424718dd6d535683a2a2ede5e not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.249863 4691 scope.go:117] "RemoveContainer" containerID="36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.250290 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b"} err="failed to get container status \"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\": rpc error: code = NotFound desc = could not find container \"36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b\": container with ID starting with 36d95203bc3297779316f06d001c72859f9b4561a9a9d7c3123c5a75e8d8bc6b not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.250313 4691 scope.go:117] "RemoveContainer" containerID="f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.250643 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023"} err="failed to get container status \"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\": rpc error: code = NotFound desc = could not find container \"f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023\": container with ID starting with f46927454f67993366632bb3f6e37e4cb0c5f013266b18c1ac61e9cddc42e023 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.250676 4691 scope.go:117] "RemoveContainer" containerID="04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.250992 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394"} err="failed to get container status \"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\": rpc error: code = NotFound desc = could not find container \"04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394\": container with ID starting with 04bc58a7b740ca100f333ef769a949f962ff6024f8dd87b798c99a28514e0394 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.251019 4691 scope.go:117] "RemoveContainer" containerID="5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.251451 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05"} err="failed to get container status \"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\": rpc error: code = NotFound desc = could not find container \"5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05\": container with ID starting with 5b6df86867d5427797ec0c0a28b653ef5b8ec62af450207d9af5f490df725b05 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.251476 4691 scope.go:117] "RemoveContainer" containerID="20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.251908 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be"} err="failed to get container status \"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\": rpc error: code = NotFound desc = could not find container \"20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be\": container with ID starting with 20789e47d7e584044b0763af1eb8b81bbde2f247ec14e56a5a9b08c28894c3be not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.251942 4691 scope.go:117] "RemoveContainer" containerID="d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.252284 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72"} err="failed to get container status \"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\": rpc error: code = NotFound desc = could not find container \"d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72\": container with ID starting with d002e869e911c24f154df90324ace473c747a4cfa6dd60ddcbef6473f9815f72 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.252343 4691 scope.go:117] "RemoveContainer" containerID="32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.252749 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72"} err="failed to get container status \"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\": rpc error: code = NotFound desc = could not find container \"32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72\": container with ID starting with 32bbccd365b6ea944de5b2b14806be12d0b222b247d677fdd21335974d2c5e72 not found: ID does not exist"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.252776 4691 scope.go:117] "RemoveContainer" containerID="0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"
Sep 30 06:28:34 crc kubenswrapper[4691]: I0930 06:28:34.253153 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8"} err="failed to get container status \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": rpc error: code = NotFound desc = could not find container \"0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8\": container with ID starting with 0879865d465a3a53a432bcc490519dc13015da51ae4252508ba71348c72bd4c8 not found: ID does not exist"
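Every DeleteContainer error in the run above is the benign NotFound case: the container was already removed, so a second delete has nothing left to act on. A minimal sketch of how a CRI-style client might treat that gRPC code as success, assuming google.golang.org/grpc is available; this is illustrative, not kubelet's actual implementation:

```go
// Minimal sketch, not kubelet code: treat gRPC NotFound from a
// RemoveContainer-style call as "already deleted", i.e. success.
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer wraps a hypothetical CRI delete call.
func removeContainer(remove func(id string) error, id string) error {
	if err := remove(id); err != nil {
		if status.Code(err) == codes.NotFound {
			return nil // container already gone; deletion is idempotent
		}
		return err
	}
	return nil
}

func main() {
	// Simulate a runtime answering exactly like the log entries above.
	fake := func(id string) error {
		return status.Errorf(codes.NotFound, "could not find container %q", id)
	}
	if err := removeContainer(fake, "5b6df86867d5"); err != nil {
		fmt.Println("unexpected:", err)
	} else {
		fmt.Println("already removed; nothing to do")
	}
}
```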
pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" event={"ID":"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c","Type":"ContainerStarted","Data":"e5c4dad0649b8ea642e0540d6623e355eee8a531497a31ebd269835740caa674"} Sep 30 06:28:36 crc kubenswrapper[4691]: I0930 06:28:36.033334 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" event={"ID":"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c","Type":"ContainerStarted","Data":"a0557ec046a4e3ed38d0afb2bb29199b3b548170b7d0b707af60c596fc05798d"} Sep 30 06:28:36 crc kubenswrapper[4691]: I0930 06:28:36.033355 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" event={"ID":"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c","Type":"ContainerStarted","Data":"78de7846aa18a669e9a88030bcd4f6c86851c0cbc7d3b5a96b1e64a21c8e28bd"} Sep 30 06:28:37 crc kubenswrapper[4691]: I0930 06:28:37.045811 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" event={"ID":"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c","Type":"ContainerStarted","Data":"6e8cda0c661ff3962a2747d66b27929220bd3d3c3e5abb3e2ac952a632862cfe"} Sep 30 06:28:39 crc kubenswrapper[4691]: I0930 06:28:39.066276 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" event={"ID":"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c","Type":"ContainerStarted","Data":"927912de63fd861a0e49badb5c02b03a3dc7add8bf45e5c9f0f0d0594eb43e77"} Sep 30 06:28:41 crc kubenswrapper[4691]: I0930 06:28:41.084211 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" event={"ID":"91261a0e-dfcd-4ae8-ae4f-a72ae6409d1c","Type":"ContainerStarted","Data":"c976d148d5a6ba722d5307fa749584dfbd9500d27bfddd816983a29724ddbfe2"} Sep 30 06:28:41 crc kubenswrapper[4691]: I0930 06:28:41.084570 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:41 crc kubenswrapper[4691]: I0930 06:28:41.084662 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:41 crc kubenswrapper[4691]: I0930 06:28:41.084673 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:41 crc kubenswrapper[4691]: I0930 06:28:41.121678 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:41 crc kubenswrapper[4691]: I0930 06:28:41.124816 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:28:41 crc kubenswrapper[4691]: I0930 06:28:41.126020 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" podStartSLOduration=8.125999959 podStartE2EDuration="8.125999959s" podCreationTimestamp="2025-09-30 06:28:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:28:41.119970738 +0000 UTC m=+564.594991828" watchObservedRunningTime="2025-09-30 06:28:41.125999959 +0000 UTC m=+564.601020999" Sep 30 06:28:48 crc kubenswrapper[4691]: I0930 06:28:48.225265 4691 scope.go:117] "RemoveContainer" containerID="6dc9f6de9a72745abb5fcd3a1cc65a6aade6d9c7dc8696106fc1a98b3550d079" Sep 30 06:28:48 crc kubenswrapper[4691]: E0930 06:28:48.227219 4691 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-xjjw8_openshift-multus(5bfd073c-4582-4a65-8170-7030f4852174)\"" pod="openshift-multus/multus-xjjw8" podUID="5bfd073c-4582-4a65-8170-7030f4852174" Sep 30 06:28:52 crc kubenswrapper[4691]: I0930 06:28:52.850495 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:28:52 crc kubenswrapper[4691]: I0930 06:28:52.850588 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:29:01 crc kubenswrapper[4691]: I0930 06:29:01.226074 4691 scope.go:117] "RemoveContainer" containerID="6dc9f6de9a72745abb5fcd3a1cc65a6aade6d9c7dc8696106fc1a98b3550d079" Sep 30 06:29:02 crc kubenswrapper[4691]: I0930 06:29:02.225169 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/2.log" Sep 30 06:29:02 crc kubenswrapper[4691]: I0930 06:29:02.226124 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/1.log" Sep 30 06:29:02 crc kubenswrapper[4691]: I0930 06:29:02.226168 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-xjjw8" event={"ID":"5bfd073c-4582-4a65-8170-7030f4852174","Type":"ContainerStarted","Data":"755614c98ef91b93882452af8d323acaff80304f9dddde731df6640e12a67dab"} Sep 30 06:29:02 crc kubenswrapper[4691]: I0930 06:29:02.868477 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr"] Sep 30 06:29:02 crc kubenswrapper[4691]: I0930 06:29:02.870226 4691 util.go:30] "No sandbox for pod can be found. 
Sep 30 06:29:02 crc kubenswrapper[4691]: I0930 06:29:02.872653 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Sep 30 06:29:02 crc kubenswrapper[4691]: I0930 06:29:02.882533 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr"]
Sep 30 06:29:02 crc kubenswrapper[4691]: I0930 06:29:02.996397 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz4sn\" (UniqueName: \"kubernetes.io/projected/de284bf7-ec7a-419c-89fb-8a555cd5b320-kube-api-access-fz4sn\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr"
Sep 30 06:29:02 crc kubenswrapper[4691]: I0930 06:29:02.996630 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr"
Sep 30 06:29:02 crc kubenswrapper[4691]: I0930 06:29:02.996883 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr"
Sep 30 06:29:03 crc kubenswrapper[4691]: I0930 06:29:03.098289 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz4sn\" (UniqueName: \"kubernetes.io/projected/de284bf7-ec7a-419c-89fb-8a555cd5b320-kube-api-access-fz4sn\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr"
Sep 30 06:29:03 crc kubenswrapper[4691]: I0930 06:29:03.098768 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr"
Sep 30 06:29:03 crc kubenswrapper[4691]: I0930 06:29:03.098875 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr"
Sep 30 06:29:03 crc kubenswrapper[4691]: I0930 06:29:03.099527 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr"
\"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-util\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" Sep 30 06:29:03 crc kubenswrapper[4691]: I0930 06:29:03.099874 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-bundle\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" Sep 30 06:29:03 crc kubenswrapper[4691]: I0930 06:29:03.131842 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz4sn\" (UniqueName: \"kubernetes.io/projected/de284bf7-ec7a-419c-89fb-8a555cd5b320-kube-api-access-fz4sn\") pod \"a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") " pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" Sep 30 06:29:03 crc kubenswrapper[4691]: I0930 06:29:03.198143 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" Sep 30 06:29:03 crc kubenswrapper[4691]: I0930 06:29:03.507396 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr"] Sep 30 06:29:03 crc kubenswrapper[4691]: W0930 06:29:03.515877 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde284bf7_ec7a_419c_89fb_8a555cd5b320.slice/crio-4cb635e985c5394ee5681c63f06c996c657a182b446a6ecc06481dead893725a WatchSource:0}: Error finding container 4cb635e985c5394ee5681c63f06c996c657a182b446a6ecc06481dead893725a: Status 404 returned error can't find the container with id 4cb635e985c5394ee5681c63f06c996c657a182b446a6ecc06481dead893725a Sep 30 06:29:04 crc kubenswrapper[4691]: I0930 06:29:04.184436 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-7scx6" Sep 30 06:29:04 crc kubenswrapper[4691]: I0930 06:29:04.243932 4691 generic.go:334] "Generic (PLEG): container finished" podID="de284bf7-ec7a-419c-89fb-8a555cd5b320" containerID="6ba7f78459ace51ea815baf4b38974df3c9941a327c555a75c96df5c6785409e" exitCode=0 Sep 30 06:29:04 crc kubenswrapper[4691]: I0930 06:29:04.243983 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" event={"ID":"de284bf7-ec7a-419c-89fb-8a555cd5b320","Type":"ContainerDied","Data":"6ba7f78459ace51ea815baf4b38974df3c9941a327c555a75c96df5c6785409e"} Sep 30 06:29:04 crc kubenswrapper[4691]: I0930 06:29:04.244012 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" event={"ID":"de284bf7-ec7a-419c-89fb-8a555cd5b320","Type":"ContainerStarted","Data":"4cb635e985c5394ee5681c63f06c996c657a182b446a6ecc06481dead893725a"} Sep 30 06:29:06 crc kubenswrapper[4691]: I0930 06:29:06.260223 4691 generic.go:334] "Generic (PLEG): container finished" podID="de284bf7-ec7a-419c-89fb-8a555cd5b320" 
containerID="9dbd6b49c5739a16a62cb36ad22721c1d9f4c31224888f85ed051a34cc6e5ffa" exitCode=0 Sep 30 06:29:06 crc kubenswrapper[4691]: I0930 06:29:06.260309 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" event={"ID":"de284bf7-ec7a-419c-89fb-8a555cd5b320","Type":"ContainerDied","Data":"9dbd6b49c5739a16a62cb36ad22721c1d9f4c31224888f85ed051a34cc6e5ffa"} Sep 30 06:29:07 crc kubenswrapper[4691]: I0930 06:29:07.272788 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" event={"ID":"de284bf7-ec7a-419c-89fb-8a555cd5b320","Type":"ContainerStarted","Data":"6e9b6f2f8f145731a48eafba6041d26b14d429c443b17cbc201a70e93d258d94"} Sep 30 06:29:07 crc kubenswrapper[4691]: I0930 06:29:07.304718 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" podStartSLOduration=3.755961147 podStartE2EDuration="5.304690637s" podCreationTimestamp="2025-09-30 06:29:02 +0000 UTC" firstStartedPulling="2025-09-30 06:29:04.246985635 +0000 UTC m=+587.722006715" lastFinishedPulling="2025-09-30 06:29:05.795715165 +0000 UTC m=+589.270736205" observedRunningTime="2025-09-30 06:29:07.299277156 +0000 UTC m=+590.774298236" watchObservedRunningTime="2025-09-30 06:29:07.304690637 +0000 UTC m=+590.779711707" Sep 30 06:29:08 crc kubenswrapper[4691]: I0930 06:29:08.283221 4691 generic.go:334] "Generic (PLEG): container finished" podID="de284bf7-ec7a-419c-89fb-8a555cd5b320" containerID="6e9b6f2f8f145731a48eafba6041d26b14d429c443b17cbc201a70e93d258d94" exitCode=0 Sep 30 06:29:08 crc kubenswrapper[4691]: I0930 06:29:08.283286 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" event={"ID":"de284bf7-ec7a-419c-89fb-8a555cd5b320","Type":"ContainerDied","Data":"6e9b6f2f8f145731a48eafba6041d26b14d429c443b17cbc201a70e93d258d94"} Sep 30 06:29:09 crc kubenswrapper[4691]: I0930 06:29:09.608544 4691 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 06:29:09 crc kubenswrapper[4691]: I0930 06:29:09.698360 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-bundle\") pod \"de284bf7-ec7a-419c-89fb-8a555cd5b320\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") "
Sep 30 06:29:09 crc kubenswrapper[4691]: I0930 06:29:09.698438 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-util\") pod \"de284bf7-ec7a-419c-89fb-8a555cd5b320\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") "
Sep 30 06:29:09 crc kubenswrapper[4691]: I0930 06:29:09.698570 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz4sn\" (UniqueName: \"kubernetes.io/projected/de284bf7-ec7a-419c-89fb-8a555cd5b320-kube-api-access-fz4sn\") pod \"de284bf7-ec7a-419c-89fb-8a555cd5b320\" (UID: \"de284bf7-ec7a-419c-89fb-8a555cd5b320\") "
Sep 30 06:29:09 crc kubenswrapper[4691]: I0930 06:29:09.700121 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-bundle" (OuterVolumeSpecName: "bundle") pod "de284bf7-ec7a-419c-89fb-8a555cd5b320" (UID: "de284bf7-ec7a-419c-89fb-8a555cd5b320"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:29:09 crc kubenswrapper[4691]: I0930 06:29:09.704398 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de284bf7-ec7a-419c-89fb-8a555cd5b320-kube-api-access-fz4sn" (OuterVolumeSpecName: "kube-api-access-fz4sn") pod "de284bf7-ec7a-419c-89fb-8a555cd5b320" (UID: "de284bf7-ec7a-419c-89fb-8a555cd5b320"). InnerVolumeSpecName "kube-api-access-fz4sn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:29:09 crc kubenswrapper[4691]: I0930 06:29:09.711708 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-util" (OuterVolumeSpecName: "util") pod "de284bf7-ec7a-419c-89fb-8a555cd5b320" (UID: "de284bf7-ec7a-419c-89fb-8a555cd5b320"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:29:09 crc kubenswrapper[4691]: I0930 06:29:09.800450 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz4sn\" (UniqueName: \"kubernetes.io/projected/de284bf7-ec7a-419c-89fb-8a555cd5b320-kube-api-access-fz4sn\") on node \"crc\" DevicePath \"\"" Sep 30 06:29:09 crc kubenswrapper[4691]: I0930 06:29:09.800492 4691 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:29:09 crc kubenswrapper[4691]: I0930 06:29:09.800502 4691 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/de284bf7-ec7a-419c-89fb-8a555cd5b320-util\") on node \"crc\" DevicePath \"\"" Sep 30 06:29:10 crc kubenswrapper[4691]: I0930 06:29:10.299674 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" event={"ID":"de284bf7-ec7a-419c-89fb-8a555cd5b320","Type":"ContainerDied","Data":"4cb635e985c5394ee5681c63f06c996c657a182b446a6ecc06481dead893725a"} Sep 30 06:29:10 crc kubenswrapper[4691]: I0930 06:29:10.299728 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cb635e985c5394ee5681c63f06c996c657a182b446a6ecc06481dead893725a" Sep 30 06:29:10 crc kubenswrapper[4691]: I0930 06:29:10.299752 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr" Sep 30 06:29:17 crc kubenswrapper[4691]: I0930 06:29:17.525272 4691 scope.go:117] "RemoveContainer" containerID="139bc9d998c0acce39c92915ac1c15ec414b09e1f5c0ae15c37f7af6e5cf570d" Sep 30 06:29:18 crc kubenswrapper[4691]: I0930 06:29:18.347511 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-xjjw8_5bfd073c-4582-4a65-8170-7030f4852174/kube-multus/2.log" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.060423 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h"] Sep 30 06:29:20 crc kubenswrapper[4691]: E0930 06:29:20.060737 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de284bf7-ec7a-419c-89fb-8a555cd5b320" containerName="pull" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.060758 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="de284bf7-ec7a-419c-89fb-8a555cd5b320" containerName="pull" Sep 30 06:29:20 crc kubenswrapper[4691]: E0930 06:29:20.060775 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de284bf7-ec7a-419c-89fb-8a555cd5b320" containerName="extract" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.060786 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="de284bf7-ec7a-419c-89fb-8a555cd5b320" containerName="extract" Sep 30 06:29:20 crc kubenswrapper[4691]: E0930 06:29:20.060808 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de284bf7-ec7a-419c-89fb-8a555cd5b320" containerName="util" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.060819 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="de284bf7-ec7a-419c-89fb-8a555cd5b320" containerName="util" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.060978 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="de284bf7-ec7a-419c-89fb-8a555cd5b320" containerName="extract" 
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.061488 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.064951 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.065161 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-wgwjc"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.065298 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.111411 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h"]
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.217692 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t"]
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.218291 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.220195 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.220455 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-k7b42"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.227001 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt"]
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.227778 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.243301 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt"]
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.254111 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rgvd\" (UniqueName: \"kubernetes.io/projected/276e7e45-2756-4551-867f-2184113b0749-kube-api-access-6rgvd\") pod \"obo-prometheus-operator-7c8cf85677-xqs9h\" (UID: \"276e7e45-2756-4551-867f-2184113b0749\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.273476 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t"]
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.367350 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a44c0de-cf12-49e9-9f72-eb618b14445b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt\" (UID: \"5a44c0de-cf12-49e9-9f72-eb618b14445b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.367404 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rgvd\" (UniqueName: \"kubernetes.io/projected/276e7e45-2756-4551-867f-2184113b0749-kube-api-access-6rgvd\") pod \"obo-prometheus-operator-7c8cf85677-xqs9h\" (UID: \"276e7e45-2756-4551-867f-2184113b0749\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.367444 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9ef8251-85be-4df7-9372-65a9fa9db6f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-h259t\" (UID: \"b9ef8251-85be-4df7-9372-65a9fa9db6f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.367470 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9ef8251-85be-4df7-9372-65a9fa9db6f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-h259t\" (UID: \"b9ef8251-85be-4df7-9372-65a9fa9db6f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.367505 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a44c0de-cf12-49e9-9f72-eb618b14445b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt\" (UID: \"5a44c0de-cf12-49e9-9f72-eb618b14445b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.383400 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-jw7dt"]
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.384256 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt"
4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.386113 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-vl5r6" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.386644 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.400793 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rgvd\" (UniqueName: \"kubernetes.io/projected/276e7e45-2756-4551-867f-2184113b0749-kube-api-access-6rgvd\") pod \"obo-prometheus-operator-7c8cf85677-xqs9h\" (UID: \"276e7e45-2756-4551-867f-2184113b0749\") " pod="openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.408850 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-jw7dt"] Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.468837 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a44c0de-cf12-49e9-9f72-eb618b14445b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt\" (UID: \"5a44c0de-cf12-49e9-9f72-eb618b14445b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.469848 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97bgd\" (UniqueName: \"kubernetes.io/projected/426f2d02-4b9e-432d-a888-c799b2db417a-kube-api-access-97bgd\") pod \"observability-operator-cc5f78dfc-jw7dt\" (UID: \"426f2d02-4b9e-432d-a888-c799b2db417a\") " pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.470027 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a44c0de-cf12-49e9-9f72-eb618b14445b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt\" (UID: \"5a44c0de-cf12-49e9-9f72-eb618b14445b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.470141 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/426f2d02-4b9e-432d-a888-c799b2db417a-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-jw7dt\" (UID: \"426f2d02-4b9e-432d-a888-c799b2db417a\") " pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.470302 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9ef8251-85be-4df7-9372-65a9fa9db6f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-h259t\" (UID: \"b9ef8251-85be-4df7-9372-65a9fa9db6f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.470750 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9ef8251-85be-4df7-9372-65a9fa9db6f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-h259t\" (UID: \"b9ef8251-85be-4df7-9372-65a9fa9db6f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.473289 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b9ef8251-85be-4df7-9372-65a9fa9db6f7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-h259t\" (UID: \"b9ef8251-85be-4df7-9372-65a9fa9db6f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.473698 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/5a44c0de-cf12-49e9-9f72-eb618b14445b-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt\" (UID: \"5a44c0de-cf12-49e9-9f72-eb618b14445b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.474263 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b9ef8251-85be-4df7-9372-65a9fa9db6f7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-h259t\" (UID: \"b9ef8251-85be-4df7-9372-65a9fa9db6f7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.478487 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/5a44c0de-cf12-49e9-9f72-eb618b14445b-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt\" (UID: \"5a44c0de-cf12-49e9-9f72-eb618b14445b\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.537377 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t" Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.553435 4691 util.go:30] "No sandbox for pod can be found. 
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.572653 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97bgd\" (UniqueName: \"kubernetes.io/projected/426f2d02-4b9e-432d-a888-c799b2db417a-kube-api-access-97bgd\") pod \"observability-operator-cc5f78dfc-jw7dt\" (UID: \"426f2d02-4b9e-432d-a888-c799b2db417a\") " pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.573526 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/426f2d02-4b9e-432d-a888-c799b2db417a-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-jw7dt\" (UID: \"426f2d02-4b9e-432d-a888-c799b2db417a\") " pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.580880 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/426f2d02-4b9e-432d-a888-c799b2db417a-observability-operator-tls\") pod \"observability-operator-cc5f78dfc-jw7dt\" (UID: \"426f2d02-4b9e-432d-a888-c799b2db417a\") " pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.592658 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-fmpwf"]
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.593720 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.595463 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-55lr2"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.601130 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97bgd\" (UniqueName: \"kubernetes.io/projected/426f2d02-4b9e-432d-a888-c799b2db417a-kube-api-access-97bgd\") pod \"observability-operator-cc5f78dfc-jw7dt\" (UID: \"426f2d02-4b9e-432d-a888-c799b2db417a\") " pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.606345 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-fmpwf"]
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.675658 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sncdr\" (UniqueName: \"kubernetes.io/projected/e6e2f68b-8f48-4a4d-a96d-400c32cb80c9-kube-api-access-sncdr\") pod \"perses-operator-54bc95c9fb-fmpwf\" (UID: \"e6e2f68b-8f48-4a4d-a96d-400c32cb80c9\") " pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.675703 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6e2f68b-8f48-4a4d-a96d-400c32cb80c9-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-fmpwf\" (UID: \"e6e2f68b-8f48-4a4d-a96d-400c32cb80c9\") " pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.678397 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.700860 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.776742 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sncdr\" (UniqueName: \"kubernetes.io/projected/e6e2f68b-8f48-4a4d-a96d-400c32cb80c9-kube-api-access-sncdr\") pod \"perses-operator-54bc95c9fb-fmpwf\" (UID: \"e6e2f68b-8f48-4a4d-a96d-400c32cb80c9\") " pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.776781 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6e2f68b-8f48-4a4d-a96d-400c32cb80c9-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-fmpwf\" (UID: \"e6e2f68b-8f48-4a4d-a96d-400c32cb80c9\") " pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.777614 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/e6e2f68b-8f48-4a4d-a96d-400c32cb80c9-openshift-service-ca\") pod \"perses-operator-54bc95c9fb-fmpwf\" (UID: \"e6e2f68b-8f48-4a4d-a96d-400c32cb80c9\") " pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.807466 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sncdr\" (UniqueName: \"kubernetes.io/projected/e6e2f68b-8f48-4a4d-a96d-400c32cb80c9-kube-api-access-sncdr\") pod \"perses-operator-54bc95c9fb-fmpwf\" (UID: \"e6e2f68b-8f48-4a4d-a96d-400c32cb80c9\") " pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.849214 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt"]
Sep 30 06:29:20 crc kubenswrapper[4691]: W0930 06:29:20.859977 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a44c0de_cf12_49e9_9f72_eb618b14445b.slice/crio-ac1cfa4eba1abdd1fbedea2b992f23c3960e4cab10776ce661694800ad11cfa7 WatchSource:0}: Error finding container ac1cfa4eba1abdd1fbedea2b992f23c3960e4cab10776ce661694800ad11cfa7: Status 404 returned error can't find the container with id ac1cfa4eba1abdd1fbedea2b992f23c3960e4cab10776ce661694800ad11cfa7
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.928855 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf"
Sep 30 06:29:20 crc kubenswrapper[4691]: I0930 06:29:20.997755 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t"]
Sep 30 06:29:21 crc kubenswrapper[4691]: I0930 06:29:21.154822 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-cc5f78dfc-jw7dt"]
Sep 30 06:29:21 crc kubenswrapper[4691]: I0930 06:29:21.160134 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-54bc95c9fb-fmpwf"]
Sep 30 06:29:21 crc kubenswrapper[4691]: W0930 06:29:21.165666 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod426f2d02_4b9e_432d_a888_c799b2db417a.slice/crio-df3d51495d9ee912abf05db61c57755889fa7058d7b763fef756e730e36f4bed WatchSource:0}: Error finding container df3d51495d9ee912abf05db61c57755889fa7058d7b763fef756e730e36f4bed: Status 404 returned error can't find the container with id df3d51495d9ee912abf05db61c57755889fa7058d7b763fef756e730e36f4bed
Sep 30 06:29:21 crc kubenswrapper[4691]: I0930 06:29:21.167666 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h"]
Sep 30 06:29:21 crc kubenswrapper[4691]: W0930 06:29:21.172429 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6e2f68b_8f48_4a4d_a96d_400c32cb80c9.slice/crio-719747eeb87f127c799065604c22d83f56777d060032c11c623f1156193182d6 WatchSource:0}: Error finding container 719747eeb87f127c799065604c22d83f56777d060032c11c623f1156193182d6: Status 404 returned error can't find the container with id 719747eeb87f127c799065604c22d83f56777d060032c11c623f1156193182d6
Sep 30 06:29:21 crc kubenswrapper[4691]: I0930 06:29:21.367163 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf" event={"ID":"e6e2f68b-8f48-4a4d-a96d-400c32cb80c9","Type":"ContainerStarted","Data":"719747eeb87f127c799065604c22d83f56777d060032c11c623f1156193182d6"}
Sep 30 06:29:21 crc kubenswrapper[4691]: I0930 06:29:21.368695 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt" event={"ID":"5a44c0de-cf12-49e9-9f72-eb618b14445b","Type":"ContainerStarted","Data":"ac1cfa4eba1abdd1fbedea2b992f23c3960e4cab10776ce661694800ad11cfa7"}
Sep 30 06:29:21 crc kubenswrapper[4691]: I0930 06:29:21.370023 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h" event={"ID":"276e7e45-2756-4551-867f-2184113b0749","Type":"ContainerStarted","Data":"0aa6248c6bdb1b1f781c1dbf40c69bc86993e437f7ce2c71909beb3e91d0c52f"}
Sep 30 06:29:21 crc kubenswrapper[4691]: I0930 06:29:21.371395 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt" event={"ID":"426f2d02-4b9e-432d-a888-c799b2db417a","Type":"ContainerStarted","Data":"df3d51495d9ee912abf05db61c57755889fa7058d7b763fef756e730e36f4bed"}
Sep 30 06:29:21 crc kubenswrapper[4691]: I0930 06:29:21.372604 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t" event={"ID":"b9ef8251-85be-4df7-9372-65a9fa9db6f7","Type":"ContainerStarted","Data":"6347e0bc376edcd571a4bd77ca80be295eb4a7f62c0b7cff5ba9ceb06d04c6ac"}
event={"ID":"b9ef8251-85be-4df7-9372-65a9fa9db6f7","Type":"ContainerStarted","Data":"6347e0bc376edcd571a4bd77ca80be295eb4a7f62c0b7cff5ba9ceb06d04c6ac"} Sep 30 06:29:22 crc kubenswrapper[4691]: I0930 06:29:22.850447 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:29:22 crc kubenswrapper[4691]: I0930 06:29:22.850734 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.499860 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf" event={"ID":"e6e2f68b-8f48-4a4d-a96d-400c32cb80c9","Type":"ContainerStarted","Data":"d13593b71b64f2716c4d78a84b483ad64ceb6a88ce1972cdb9f9d895a4d63f56"} Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.500502 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf" Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.502418 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt" event={"ID":"5a44c0de-cf12-49e9-9f72-eb618b14445b","Type":"ContainerStarted","Data":"599db6544045289b575c45f5fe51e4294d126a3d6558a551539a7c24c31ac6f7"} Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.503984 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h" event={"ID":"276e7e45-2756-4551-867f-2184113b0749","Type":"ContainerStarted","Data":"f5f4e4349551db25dae0a217d9a4f3dd75c099ca41a46fde4cd70e3832983465"} Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.505446 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt" event={"ID":"426f2d02-4b9e-432d-a888-c799b2db417a","Type":"ContainerStarted","Data":"3b3c7253c5f18998f80bba3ec1a4e99b7c7841952da62486ab5732b4c5d94495"} Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.505733 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt" Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.506711 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t" event={"ID":"b9ef8251-85be-4df7-9372-65a9fa9db6f7","Type":"ContainerStarted","Data":"d08dbc34161e3d2969bbc518592597a7aa51d8e1ac3c5c6c3c0f5402529d8245"} Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.512448 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt" Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.526420 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf" podStartSLOduration=1.754625436 podStartE2EDuration="16.52640507s" podCreationTimestamp="2025-09-30 06:29:20 +0000 UTC" firstStartedPulling="2025-09-30 
06:29:21.174764892 +0000 UTC m=+604.649785932" lastFinishedPulling="2025-09-30 06:29:35.946544516 +0000 UTC m=+619.421565566" observedRunningTime="2025-09-30 06:29:36.525642836 +0000 UTC m=+620.000663906" watchObservedRunningTime="2025-09-30 06:29:36.52640507 +0000 UTC m=+620.001426120" Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.559850 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-7c8cf85677-xqs9h" podStartSLOduration=1.7826899840000001 podStartE2EDuration="16.559837788s" podCreationTimestamp="2025-09-30 06:29:20 +0000 UTC" firstStartedPulling="2025-09-30 06:29:21.171053444 +0000 UTC m=+604.646074484" lastFinishedPulling="2025-09-30 06:29:35.948201248 +0000 UTC m=+619.423222288" observedRunningTime="2025-09-30 06:29:36.559332252 +0000 UTC m=+620.034353302" watchObservedRunningTime="2025-09-30 06:29:36.559837788 +0000 UTC m=+620.034858828" Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.577956 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-cc5f78dfc-jw7dt" podStartSLOduration=1.7519660529999999 podStartE2EDuration="16.57794065s" podCreationTimestamp="2025-09-30 06:29:20 +0000 UTC" firstStartedPulling="2025-09-30 06:29:21.169467334 +0000 UTC m=+604.644488374" lastFinishedPulling="2025-09-30 06:29:35.995441911 +0000 UTC m=+619.470462971" observedRunningTime="2025-09-30 06:29:36.577360561 +0000 UTC m=+620.052381611" watchObservedRunningTime="2025-09-30 06:29:36.57794065 +0000 UTC m=+620.052961690" Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.602907 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt" podStartSLOduration=1.4726877919999999 podStartE2EDuration="16.602879568s" podCreationTimestamp="2025-09-30 06:29:20 +0000 UTC" firstStartedPulling="2025-09-30 06:29:20.865368049 +0000 UTC m=+604.340389079" lastFinishedPulling="2025-09-30 06:29:35.995559765 +0000 UTC m=+619.470580855" observedRunningTime="2025-09-30 06:29:36.600009467 +0000 UTC m=+620.075030507" watchObservedRunningTime="2025-09-30 06:29:36.602879568 +0000 UTC m=+620.077900608" Sep 30 06:29:36 crc kubenswrapper[4691]: I0930 06:29:36.621611 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-67d746f8c7-h259t" podStartSLOduration=1.690545539 podStartE2EDuration="16.621594159s" podCreationTimestamp="2025-09-30 06:29:20 +0000 UTC" firstStartedPulling="2025-09-30 06:29:21.015442254 +0000 UTC m=+604.490463294" lastFinishedPulling="2025-09-30 06:29:35.946490874 +0000 UTC m=+619.421511914" observedRunningTime="2025-09-30 06:29:36.619899766 +0000 UTC m=+620.094920806" watchObservedRunningTime="2025-09-30 06:29:36.621594159 +0000 UTC m=+620.096615199" Sep 30 06:29:50 crc kubenswrapper[4691]: I0930 06:29:50.932763 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-54bc95c9fb-fmpwf" Sep 30 06:29:52 crc kubenswrapper[4691]: I0930 06:29:52.849757 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:29:52 crc kubenswrapper[4691]: I0930 06:29:52.849819 4691 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:29:52 crc kubenswrapper[4691]: I0930 06:29:52.849866 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:29:52 crc kubenswrapper[4691]: I0930 06:29:52.850337 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"31e757fc7bb8d72540655d2ce1c4ea6d10d3a5eb3fd6ea0108f524dba7e5bca2"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:29:52 crc kubenswrapper[4691]: I0930 06:29:52.850395 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://31e757fc7bb8d72540655d2ce1c4ea6d10d3a5eb3fd6ea0108f524dba7e5bca2" gracePeriod=600 Sep 30 06:29:53 crc kubenswrapper[4691]: I0930 06:29:53.616249 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="31e757fc7bb8d72540655d2ce1c4ea6d10d3a5eb3fd6ea0108f524dba7e5bca2" exitCode=0 Sep 30 06:29:53 crc kubenswrapper[4691]: I0930 06:29:53.616639 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"31e757fc7bb8d72540655d2ce1c4ea6d10d3a5eb3fd6ea0108f524dba7e5bca2"} Sep 30 06:29:53 crc kubenswrapper[4691]: I0930 06:29:53.616666 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"a93cc69e9131d7c4e2a3f6590c1d8cfd39f8977341d3f1a63ae9e1ccb3a86989"} Sep 30 06:29:53 crc kubenswrapper[4691]: I0930 06:29:53.616680 4691 scope.go:117] "RemoveContainer" containerID="5124cec3e8ade06d39c26cde1baaa625eb5e8cb0cb2eb147c1c6f02b93ecaae0" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.136252 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc"] Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.138988 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.142087 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.142818 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.143683 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc"] Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.233484 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-config-volume\") pod \"collect-profiles-29320230-dkbgc\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.233542 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr97f\" (UniqueName: \"kubernetes.io/projected/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-kube-api-access-hr97f\") pod \"collect-profiles-29320230-dkbgc\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.233584 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-secret-volume\") pod \"collect-profiles-29320230-dkbgc\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.334408 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-config-volume\") pod \"collect-profiles-29320230-dkbgc\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.334772 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr97f\" (UniqueName: \"kubernetes.io/projected/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-kube-api-access-hr97f\") pod \"collect-profiles-29320230-dkbgc\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.335034 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-secret-volume\") pod \"collect-profiles-29320230-dkbgc\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.335677 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-config-volume\") pod 
\"collect-profiles-29320230-dkbgc\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.342000 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-secret-volume\") pod \"collect-profiles-29320230-dkbgc\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.350418 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr97f\" (UniqueName: \"kubernetes.io/projected/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-kube-api-access-hr97f\") pod \"collect-profiles-29320230-dkbgc\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.459760 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:00 crc kubenswrapper[4691]: I0930 06:30:00.716419 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc"] Sep 30 06:30:01 crc kubenswrapper[4691]: I0930 06:30:01.670138 4691 generic.go:334] "Generic (PLEG): container finished" podID="e35ed7eb-1300-40cb-b087-8d4aa2cb1daa" containerID="ed2d986debd486b47f26a584828f12308d58cc18e95a6f9578be61260d53ae36" exitCode=0 Sep 30 06:30:01 crc kubenswrapper[4691]: I0930 06:30:01.670218 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" event={"ID":"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa","Type":"ContainerDied","Data":"ed2d986debd486b47f26a584828f12308d58cc18e95a6f9578be61260d53ae36"} Sep 30 06:30:01 crc kubenswrapper[4691]: I0930 06:30:01.670481 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" event={"ID":"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa","Type":"ContainerStarted","Data":"3eb607733ad43aede3745082b85208316e13bd9c10ae78ddce42a000b6571598"} Sep 30 06:30:02 crc kubenswrapper[4691]: I0930 06:30:02.967386 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.076355 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hr97f\" (UniqueName: \"kubernetes.io/projected/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-kube-api-access-hr97f\") pod \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.076423 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-secret-volume\") pod \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.076513 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-config-volume\") pod \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\" (UID: \"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa\") " Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.077166 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-config-volume" (OuterVolumeSpecName: "config-volume") pod "e35ed7eb-1300-40cb-b087-8d4aa2cb1daa" (UID: "e35ed7eb-1300-40cb-b087-8d4aa2cb1daa"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.077415 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.081833 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e35ed7eb-1300-40cb-b087-8d4aa2cb1daa" (UID: "e35ed7eb-1300-40cb-b087-8d4aa2cb1daa"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.082002 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-kube-api-access-hr97f" (OuterVolumeSpecName: "kube-api-access-hr97f") pod "e35ed7eb-1300-40cb-b087-8d4aa2cb1daa" (UID: "e35ed7eb-1300-40cb-b087-8d4aa2cb1daa"). InnerVolumeSpecName "kube-api-access-hr97f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.178673 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hr97f\" (UniqueName: \"kubernetes.io/projected/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-kube-api-access-hr97f\") on node \"crc\" DevicePath \"\"" Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.178704 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.685085 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" event={"ID":"e35ed7eb-1300-40cb-b087-8d4aa2cb1daa","Type":"ContainerDied","Data":"3eb607733ad43aede3745082b85208316e13bd9c10ae78ddce42a000b6571598"} Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.685443 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3eb607733ad43aede3745082b85208316e13bd9c10ae78ddce42a000b6571598" Sep 30 06:30:03 crc kubenswrapper[4691]: I0930 06:30:03.685139 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.541438 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs"] Sep 30 06:30:09 crc kubenswrapper[4691]: E0930 06:30:09.542079 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35ed7eb-1300-40cb-b087-8d4aa2cb1daa" containerName="collect-profiles" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.542100 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35ed7eb-1300-40cb-b087-8d4aa2cb1daa" containerName="collect-profiles" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.542315 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35ed7eb-1300-40cb-b087-8d4aa2cb1daa" containerName="collect-profiles" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.543678 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.551267 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.556153 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs"] Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.668332 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjnkk\" (UniqueName: \"kubernetes.io/projected/36dddebb-8230-4914-b81c-b53683028a63-kube-api-access-qjnkk\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.668556 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.668700 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.770119 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjnkk\" (UniqueName: \"kubernetes.io/projected/36dddebb-8230-4914-b81c-b53683028a63-kube-api-access-qjnkk\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.770224 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.770269 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.771086 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-bundle\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.771278 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-util\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.798395 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjnkk\" (UniqueName: \"kubernetes.io/projected/36dddebb-8230-4914-b81c-b53683028a63-kube-api-access-qjnkk\") pod \"9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:09 crc kubenswrapper[4691]: I0930 06:30:09.910592 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:10 crc kubenswrapper[4691]: I0930 06:30:10.205467 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs"] Sep 30 06:30:10 crc kubenswrapper[4691]: W0930 06:30:10.216812 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36dddebb_8230_4914_b81c_b53683028a63.slice/crio-85105fefc43014bdff9bb19b511afaef33294722c1c5dbf4b3ea7f6ec6874e39 WatchSource:0}: Error finding container 85105fefc43014bdff9bb19b511afaef33294722c1c5dbf4b3ea7f6ec6874e39: Status 404 returned error can't find the container with id 85105fefc43014bdff9bb19b511afaef33294722c1c5dbf4b3ea7f6ec6874e39 Sep 30 06:30:10 crc kubenswrapper[4691]: I0930 06:30:10.734679 4691 generic.go:334] "Generic (PLEG): container finished" podID="36dddebb-8230-4914-b81c-b53683028a63" containerID="02d9fda25cb783bee1d7faf43f1f562b366a02ca6d87394058e870c65cb72078" exitCode=0 Sep 30 06:30:10 crc kubenswrapper[4691]: I0930 06:30:10.735040 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" event={"ID":"36dddebb-8230-4914-b81c-b53683028a63","Type":"ContainerDied","Data":"02d9fda25cb783bee1d7faf43f1f562b366a02ca6d87394058e870c65cb72078"} Sep 30 06:30:10 crc kubenswrapper[4691]: I0930 06:30:10.735112 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" event={"ID":"36dddebb-8230-4914-b81c-b53683028a63","Type":"ContainerStarted","Data":"85105fefc43014bdff9bb19b511afaef33294722c1c5dbf4b3ea7f6ec6874e39"} Sep 30 06:30:12 crc kubenswrapper[4691]: I0930 06:30:12.752736 4691 generic.go:334] "Generic (PLEG): container finished" podID="36dddebb-8230-4914-b81c-b53683028a63" containerID="4395840b2cdc76b8ca08f259aaf71b27a084de2f8a73e098cab3e509e13597bb" exitCode=0 Sep 30 06:30:12 crc kubenswrapper[4691]: I0930 06:30:12.752960 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" event={"ID":"36dddebb-8230-4914-b81c-b53683028a63","Type":"ContainerDied","Data":"4395840b2cdc76b8ca08f259aaf71b27a084de2f8a73e098cab3e509e13597bb"} Sep 30 06:30:13 crc kubenswrapper[4691]: I0930 06:30:13.759825 4691 generic.go:334] "Generic (PLEG): container finished" podID="36dddebb-8230-4914-b81c-b53683028a63" containerID="7a288cbf1371e10adca6d0733137332e9ec6e6c23c059b08fa29c992f47157e7" exitCode=0 Sep 30 06:30:13 crc kubenswrapper[4691]: I0930 06:30:13.760150 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" event={"ID":"36dddebb-8230-4914-b81c-b53683028a63","Type":"ContainerDied","Data":"7a288cbf1371e10adca6d0733137332e9ec6e6c23c059b08fa29c992f47157e7"} Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.060603 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.155588 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-util\") pod \"36dddebb-8230-4914-b81c-b53683028a63\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.155660 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qjnkk\" (UniqueName: \"kubernetes.io/projected/36dddebb-8230-4914-b81c-b53683028a63-kube-api-access-qjnkk\") pod \"36dddebb-8230-4914-b81c-b53683028a63\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.155758 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-bundle\") pod \"36dddebb-8230-4914-b81c-b53683028a63\" (UID: \"36dddebb-8230-4914-b81c-b53683028a63\") " Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.156923 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-bundle" (OuterVolumeSpecName: "bundle") pod "36dddebb-8230-4914-b81c-b53683028a63" (UID: "36dddebb-8230-4914-b81c-b53683028a63"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.162403 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36dddebb-8230-4914-b81c-b53683028a63-kube-api-access-qjnkk" (OuterVolumeSpecName: "kube-api-access-qjnkk") pod "36dddebb-8230-4914-b81c-b53683028a63" (UID: "36dddebb-8230-4914-b81c-b53683028a63"). InnerVolumeSpecName "kube-api-access-qjnkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.177204 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-util" (OuterVolumeSpecName: "util") pod "36dddebb-8230-4914-b81c-b53683028a63" (UID: "36dddebb-8230-4914-b81c-b53683028a63"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.257525 4691 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-util\") on node \"crc\" DevicePath \"\"" Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.257571 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qjnkk\" (UniqueName: \"kubernetes.io/projected/36dddebb-8230-4914-b81c-b53683028a63-kube-api-access-qjnkk\") on node \"crc\" DevicePath \"\"" Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.257593 4691 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/36dddebb-8230-4914-b81c-b53683028a63-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.783077 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" event={"ID":"36dddebb-8230-4914-b81c-b53683028a63","Type":"ContainerDied","Data":"85105fefc43014bdff9bb19b511afaef33294722c1c5dbf4b3ea7f6ec6874e39"} Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.783141 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85105fefc43014bdff9bb19b511afaef33294722c1c5dbf4b3ea7f6ec6874e39" Sep 30 06:30:15 crc kubenswrapper[4691]: I0930 06:30:15.783107 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.138270 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29"] Sep 30 06:30:21 crc kubenswrapper[4691]: E0930 06:30:21.138798 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36dddebb-8230-4914-b81c-b53683028a63" containerName="pull" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.138815 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="36dddebb-8230-4914-b81c-b53683028a63" containerName="pull" Sep 30 06:30:21 crc kubenswrapper[4691]: E0930 06:30:21.138831 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36dddebb-8230-4914-b81c-b53683028a63" containerName="extract" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.138838 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="36dddebb-8230-4914-b81c-b53683028a63" containerName="extract" Sep 30 06:30:21 crc kubenswrapper[4691]: E0930 06:30:21.138848 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36dddebb-8230-4914-b81c-b53683028a63" containerName="util" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.138856 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="36dddebb-8230-4914-b81c-b53683028a63" containerName="util" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.139013 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="36dddebb-8230-4914-b81c-b53683028a63" containerName="extract" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.139479 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.143370 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.144359 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-5bg72" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.144405 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.154454 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29"] Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.238282 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cgkhv\" (UniqueName: \"kubernetes.io/projected/8dee2c6d-f8b8-4b1a-ae65-af2728adad3e-kube-api-access-cgkhv\") pod \"nmstate-operator-5d6f6cfd66-wnr29\" (UID: \"8dee2c6d-f8b8-4b1a-ae65-af2728adad3e\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.339928 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cgkhv\" (UniqueName: \"kubernetes.io/projected/8dee2c6d-f8b8-4b1a-ae65-af2728adad3e-kube-api-access-cgkhv\") pod \"nmstate-operator-5d6f6cfd66-wnr29\" (UID: \"8dee2c6d-f8b8-4b1a-ae65-af2728adad3e\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.362368 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cgkhv\" (UniqueName: \"kubernetes.io/projected/8dee2c6d-f8b8-4b1a-ae65-af2728adad3e-kube-api-access-cgkhv\") pod \"nmstate-operator-5d6f6cfd66-wnr29\" (UID: \"8dee2c6d-f8b8-4b1a-ae65-af2728adad3e\") " pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.489402 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29" Sep 30 06:30:21 crc kubenswrapper[4691]: I0930 06:30:21.973378 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29"] Sep 30 06:30:21 crc kubenswrapper[4691]: W0930 06:30:21.976133 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dee2c6d_f8b8_4b1a_ae65_af2728adad3e.slice/crio-59de2eada044c58381d9d722028d2e9ef1156562d8e86d341528a12eaf12d39d WatchSource:0}: Error finding container 59de2eada044c58381d9d722028d2e9ef1156562d8e86d341528a12eaf12d39d: Status 404 returned error can't find the container with id 59de2eada044c58381d9d722028d2e9ef1156562d8e86d341528a12eaf12d39d Sep 30 06:30:22 crc kubenswrapper[4691]: I0930 06:30:22.832235 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29" event={"ID":"8dee2c6d-f8b8-4b1a-ae65-af2728adad3e","Type":"ContainerStarted","Data":"59de2eada044c58381d9d722028d2e9ef1156562d8e86d341528a12eaf12d39d"} Sep 30 06:30:24 crc kubenswrapper[4691]: I0930 06:30:24.845271 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29" event={"ID":"8dee2c6d-f8b8-4b1a-ae65-af2728adad3e","Type":"ContainerStarted","Data":"250f86ca9d531764ed585239fe1028644be555362080e2a68da6dd306e8d6277"} Sep 30 06:30:24 crc kubenswrapper[4691]: I0930 06:30:24.874214 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5d6f6cfd66-wnr29" podStartSLOduration=1.7232349729999998 podStartE2EDuration="3.874189801s" podCreationTimestamp="2025-09-30 06:30:21 +0000 UTC" firstStartedPulling="2025-09-30 06:30:21.97844415 +0000 UTC m=+665.453465240" lastFinishedPulling="2025-09-30 06:30:24.129399028 +0000 UTC m=+667.604420068" observedRunningTime="2025-09-30 06:30:24.867120023 +0000 UTC m=+668.342141133" watchObservedRunningTime="2025-09-30 06:30:24.874189801 +0000 UTC m=+668.349210881" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.842410 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg"] Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.843655 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.847965 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd"] Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.848630 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.857145 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bqbkw" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.857338 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.860497 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg"] Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.869946 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd"] Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.877141 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-gcxcd"] Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.877794 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.964729 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl"] Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.965560 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.967232 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-qr22g" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.967364 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.967781 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.974592 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl"] Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.998268 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88zct\" (UniqueName: \"kubernetes.io/projected/3df27da9-f98e-41a7-84fb-bfad238e7533-kube-api-access-88zct\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.998313 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49k85\" (UniqueName: \"kubernetes.io/projected/55ec66af-837d-40c5-81d2-6b311f0dc05c-kube-api-access-49k85\") pod \"nmstate-metrics-58fcddf996-tqjqg\" (UID: \"55ec66af-837d-40c5-81d2-6b311f0dc05c\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.998342 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3df27da9-f98e-41a7-84fb-bfad238e7533-nmstate-lock\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.998373 4691 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3cdd6ae9-7044-4fb4-92fb-0a503651b60d-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-pz8zd\" (UID: \"3cdd6ae9-7044-4fb4-92fb-0a503651b60d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.998453 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3df27da9-f98e-41a7-84fb-bfad238e7533-ovs-socket\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.998493 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsvfg\" (UniqueName: \"kubernetes.io/projected/3cdd6ae9-7044-4fb4-92fb-0a503651b60d-kube-api-access-zsvfg\") pod \"nmstate-webhook-6d689559c5-pz8zd\" (UID: \"3cdd6ae9-7044-4fb4-92fb-0a503651b60d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" Sep 30 06:30:30 crc kubenswrapper[4691]: I0930 06:30:30.998520 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3df27da9-f98e-41a7-84fb-bfad238e7533-dbus-socket\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.099515 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3cdd6ae9-7044-4fb4-92fb-0a503651b60d-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-pz8zd\" (UID: \"3cdd6ae9-7044-4fb4-92fb-0a503651b60d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.099601 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3df27da9-f98e-41a7-84fb-bfad238e7533-ovs-socket\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.099637 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsvfg\" (UniqueName: \"kubernetes.io/projected/3cdd6ae9-7044-4fb4-92fb-0a503651b60d-kube-api-access-zsvfg\") pod \"nmstate-webhook-6d689559c5-pz8zd\" (UID: \"3cdd6ae9-7044-4fb4-92fb-0a503651b60d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.099672 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e1989084-5e13-4ce8-9d59-050337ff70da-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-r4qfl\" (UID: \"e1989084-5e13-4ce8-9d59-050337ff70da\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.099696 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3df27da9-f98e-41a7-84fb-bfad238e7533-dbus-socket\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " 
pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.099742 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5vk4\" (UniqueName: \"kubernetes.io/projected/e1989084-5e13-4ce8-9d59-050337ff70da-kube-api-access-m5vk4\") pod \"nmstate-console-plugin-864bb6dfb5-r4qfl\" (UID: \"e1989084-5e13-4ce8-9d59-050337ff70da\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.099766 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1989084-5e13-4ce8-9d59-050337ff70da-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-r4qfl\" (UID: \"e1989084-5e13-4ce8-9d59-050337ff70da\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.099817 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49k85\" (UniqueName: \"kubernetes.io/projected/55ec66af-837d-40c5-81d2-6b311f0dc05c-kube-api-access-49k85\") pod \"nmstate-metrics-58fcddf996-tqjqg\" (UID: \"55ec66af-837d-40c5-81d2-6b311f0dc05c\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.099838 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3df27da9-f98e-41a7-84fb-bfad238e7533-nmstate-lock\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.099859 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88zct\" (UniqueName: \"kubernetes.io/projected/3df27da9-f98e-41a7-84fb-bfad238e7533-kube-api-access-88zct\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.100060 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/3df27da9-f98e-41a7-84fb-bfad238e7533-nmstate-lock\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.100153 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/3df27da9-f98e-41a7-84fb-bfad238e7533-ovs-socket\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.100408 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/3df27da9-f98e-41a7-84fb-bfad238e7533-dbus-socket\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.109357 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/3cdd6ae9-7044-4fb4-92fb-0a503651b60d-tls-key-pair\") pod \"nmstate-webhook-6d689559c5-pz8zd\" (UID: 
\"3cdd6ae9-7044-4fb4-92fb-0a503651b60d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.116418 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49k85\" (UniqueName: \"kubernetes.io/projected/55ec66af-837d-40c5-81d2-6b311f0dc05c-kube-api-access-49k85\") pod \"nmstate-metrics-58fcddf996-tqjqg\" (UID: \"55ec66af-837d-40c5-81d2-6b311f0dc05c\") " pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.119302 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsvfg\" (UniqueName: \"kubernetes.io/projected/3cdd6ae9-7044-4fb4-92fb-0a503651b60d-kube-api-access-zsvfg\") pod \"nmstate-webhook-6d689559c5-pz8zd\" (UID: \"3cdd6ae9-7044-4fb4-92fb-0a503651b60d\") " pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.121870 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88zct\" (UniqueName: \"kubernetes.io/projected/3df27da9-f98e-41a7-84fb-bfad238e7533-kube-api-access-88zct\") pod \"nmstate-handler-gcxcd\" (UID: \"3df27da9-f98e-41a7-84fb-bfad238e7533\") " pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.154608 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-77d5ffbdb7-rj9x6"] Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.155231 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.166851 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.172231 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77d5ffbdb7-rj9x6"] Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.183193 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.197265 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.200543 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1989084-5e13-4ce8-9d59-050337ff70da-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-r4qfl\" (UID: \"e1989084-5e13-4ce8-9d59-050337ff70da\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.200648 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e1989084-5e13-4ce8-9d59-050337ff70da-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-r4qfl\" (UID: \"e1989084-5e13-4ce8-9d59-050337ff70da\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.200710 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5vk4\" (UniqueName: \"kubernetes.io/projected/e1989084-5e13-4ce8-9d59-050337ff70da-kube-api-access-m5vk4\") pod \"nmstate-console-plugin-864bb6dfb5-r4qfl\" (UID: \"e1989084-5e13-4ce8-9d59-050337ff70da\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: E0930 06:30:31.201162 4691 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Sep 30 06:30:31 crc kubenswrapper[4691]: E0930 06:30:31.201211 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e1989084-5e13-4ce8-9d59-050337ff70da-plugin-serving-cert podName:e1989084-5e13-4ce8-9d59-050337ff70da nodeName:}" failed. No retries permitted until 2025-09-30 06:30:31.701195833 +0000 UTC m=+675.176216873 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/e1989084-5e13-4ce8-9d59-050337ff70da-plugin-serving-cert") pod "nmstate-console-plugin-864bb6dfb5-r4qfl" (UID: "e1989084-5e13-4ce8-9d59-050337ff70da") : secret "plugin-serving-cert" not found Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.202291 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/e1989084-5e13-4ce8-9d59-050337ff70da-nginx-conf\") pod \"nmstate-console-plugin-864bb6dfb5-r4qfl\" (UID: \"e1989084-5e13-4ce8-9d59-050337ff70da\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.233746 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5vk4\" (UniqueName: \"kubernetes.io/projected/e1989084-5e13-4ce8-9d59-050337ff70da-kube-api-access-m5vk4\") pod \"nmstate-console-plugin-864bb6dfb5-r4qfl\" (UID: \"e1989084-5e13-4ce8-9d59-050337ff70da\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.304269 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-console-oauth-config\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.304319 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-console-serving-cert\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.304449 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-console-config\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.304512 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-trusted-ca-bundle\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.304554 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-oauth-serving-cert\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.304681 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-service-ca\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " 
pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.304748 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr54q\" (UniqueName: \"kubernetes.io/projected/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-kube-api-access-hr54q\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.405164 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg"] Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.405729 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-console-config\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.405791 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-trusted-ca-bundle\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.405823 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-oauth-serving-cert\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.405851 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-service-ca\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.406317 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr54q\" (UniqueName: \"kubernetes.io/projected/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-kube-api-access-hr54q\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.406651 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-console-config\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.407198 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-oauth-serving-cert\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.407876 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-service-ca\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: W0930 06:30:31.408100 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55ec66af_837d_40c5_81d2_6b311f0dc05c.slice/crio-23d4c51ec201c95582938600e553cfd6f45b9003d840a045c084129488b53410 WatchSource:0}: Error finding container 23d4c51ec201c95582938600e553cfd6f45b9003d840a045c084129488b53410: Status 404 returned error can't find the container with id 23d4c51ec201c95582938600e553cfd6f45b9003d840a045c084129488b53410 Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.408239 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-console-oauth-config\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.408286 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-console-serving-cert\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.410799 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-console-oauth-config\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.411262 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-console-serving-cert\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.414646 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-trusted-ca-bundle\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.419304 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr54q\" (UniqueName: \"kubernetes.io/projected/080637b1-0b3e-4005-8a10-c0e1fa1bab7a-kube-api-access-hr54q\") pod \"console-77d5ffbdb7-rj9x6\" (UID: \"080637b1-0b3e-4005-8a10-c0e1fa1bab7a\") " pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.478932 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.650304 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd"] Sep 30 06:30:31 crc kubenswrapper[4691]: W0930 06:30:31.658039 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cdd6ae9_7044_4fb4_92fb_0a503651b60d.slice/crio-8d7ed65d4518b8e7aefd6369b66f04237effe3c72244109cb32e2b2f6d86dff3 WatchSource:0}: Error finding container 8d7ed65d4518b8e7aefd6369b66f04237effe3c72244109cb32e2b2f6d86dff3: Status 404 returned error can't find the container with id 8d7ed65d4518b8e7aefd6369b66f04237effe3c72244109cb32e2b2f6d86dff3 Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.711644 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1989084-5e13-4ce8-9d59-050337ff70da-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-r4qfl\" (UID: \"e1989084-5e13-4ce8-9d59-050337ff70da\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.714949 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/e1989084-5e13-4ce8-9d59-050337ff70da-plugin-serving-cert\") pod \"nmstate-console-plugin-864bb6dfb5-r4qfl\" (UID: \"e1989084-5e13-4ce8-9d59-050337ff70da\") " pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.876841 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-77d5ffbdb7-rj9x6"] Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.880622 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" Sep 30 06:30:31 crc kubenswrapper[4691]: W0930 06:30:31.885548 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod080637b1_0b3e_4005_8a10_c0e1fa1bab7a.slice/crio-71a522650879a500a555cd29d5f45730dfa911dc95dffb83983fd57c1f7c58df WatchSource:0}: Error finding container 71a522650879a500a555cd29d5f45730dfa911dc95dffb83983fd57c1f7c58df: Status 404 returned error can't find the container with id 71a522650879a500a555cd29d5f45730dfa911dc95dffb83983fd57c1f7c58df Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.897797 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg" event={"ID":"55ec66af-837d-40c5-81d2-6b311f0dc05c","Type":"ContainerStarted","Data":"23d4c51ec201c95582938600e553cfd6f45b9003d840a045c084129488b53410"} Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.899490 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" event={"ID":"3cdd6ae9-7044-4fb4-92fb-0a503651b60d","Type":"ContainerStarted","Data":"8d7ed65d4518b8e7aefd6369b66f04237effe3c72244109cb32e2b2f6d86dff3"} Sep 30 06:30:31 crc kubenswrapper[4691]: I0930 06:30:31.902128 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gcxcd" event={"ID":"3df27da9-f98e-41a7-84fb-bfad238e7533","Type":"ContainerStarted","Data":"00e678b15a5f89c7abeb7a4c82d5eb534ad757672ac98d4f112aeb4c95f928ad"} Sep 30 06:30:32 crc kubenswrapper[4691]: I0930 06:30:32.159013 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl"] Sep 30 06:30:32 crc kubenswrapper[4691]: W0930 06:30:32.168659 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1989084_5e13_4ce8_9d59_050337ff70da.slice/crio-fcd31e490d17ed6795c675ad2bd68f156f4900b3fab58754e88487d8476c1866 WatchSource:0}: Error finding container fcd31e490d17ed6795c675ad2bd68f156f4900b3fab58754e88487d8476c1866: Status 404 returned error can't find the container with id fcd31e490d17ed6795c675ad2bd68f156f4900b3fab58754e88487d8476c1866 Sep 30 06:30:32 crc kubenswrapper[4691]: I0930 06:30:32.907729 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" event={"ID":"e1989084-5e13-4ce8-9d59-050337ff70da","Type":"ContainerStarted","Data":"fcd31e490d17ed6795c675ad2bd68f156f4900b3fab58754e88487d8476c1866"} Sep 30 06:30:32 crc kubenswrapper[4691]: I0930 06:30:32.910365 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77d5ffbdb7-rj9x6" event={"ID":"080637b1-0b3e-4005-8a10-c0e1fa1bab7a","Type":"ContainerStarted","Data":"dc359db2e3a60f222e0f9a84e8ea02604e0ac430a6350ff76e89baaa09b9f972"} Sep 30 06:30:32 crc kubenswrapper[4691]: I0930 06:30:32.910409 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-77d5ffbdb7-rj9x6" event={"ID":"080637b1-0b3e-4005-8a10-c0e1fa1bab7a","Type":"ContainerStarted","Data":"71a522650879a500a555cd29d5f45730dfa911dc95dffb83983fd57c1f7c58df"} Sep 30 06:30:32 crc kubenswrapper[4691]: I0930 06:30:32.929757 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-77d5ffbdb7-rj9x6" podStartSLOduration=1.929737549 podStartE2EDuration="1.929737549s" 
podCreationTimestamp="2025-09-30 06:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:30:32.927833591 +0000 UTC m=+676.402854651" watchObservedRunningTime="2025-09-30 06:30:32.929737549 +0000 UTC m=+676.404758589" Sep 30 06:30:34 crc kubenswrapper[4691]: I0930 06:30:34.930225 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg" event={"ID":"55ec66af-837d-40c5-81d2-6b311f0dc05c","Type":"ContainerStarted","Data":"dd4aeda0f00b6735ad6e22ee88d70a88177fdab77902f4706e92d2c03e959f91"} Sep 30 06:30:34 crc kubenswrapper[4691]: I0930 06:30:34.934876 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" event={"ID":"3cdd6ae9-7044-4fb4-92fb-0a503651b60d","Type":"ContainerStarted","Data":"66b5b8175360ed2a7e01d3d18cb3fe3f412af0b980884e09229f6f22eeeee51c"} Sep 30 06:30:34 crc kubenswrapper[4691]: I0930 06:30:34.935060 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" Sep 30 06:30:34 crc kubenswrapper[4691]: I0930 06:30:34.937306 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-gcxcd" event={"ID":"3df27da9-f98e-41a7-84fb-bfad238e7533","Type":"ContainerStarted","Data":"03d4cfd89102b81f418faa68e1bbe3191dca43e585ec3199d950f6487972cf8f"} Sep 30 06:30:34 crc kubenswrapper[4691]: I0930 06:30:34.938132 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:34 crc kubenswrapper[4691]: I0930 06:30:34.964605 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" podStartSLOduration=2.63057582 podStartE2EDuration="4.964578164s" podCreationTimestamp="2025-09-30 06:30:30 +0000 UTC" firstStartedPulling="2025-09-30 06:30:31.65966679 +0000 UTC m=+675.134687830" lastFinishedPulling="2025-09-30 06:30:33.993669124 +0000 UTC m=+677.468690174" observedRunningTime="2025-09-30 06:30:34.958628073 +0000 UTC m=+678.433649153" watchObservedRunningTime="2025-09-30 06:30:34.964578164 +0000 UTC m=+678.439599244" Sep 30 06:30:34 crc kubenswrapper[4691]: I0930 06:30:34.986035 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-gcxcd" podStartSLOduration=2.209747948 podStartE2EDuration="4.986007212s" podCreationTimestamp="2025-09-30 06:30:30 +0000 UTC" firstStartedPulling="2025-09-30 06:30:31.24216719 +0000 UTC m=+674.717188250" lastFinishedPulling="2025-09-30 06:30:34.018426464 +0000 UTC m=+677.493447514" observedRunningTime="2025-09-30 06:30:34.984387732 +0000 UTC m=+678.459408792" watchObservedRunningTime="2025-09-30 06:30:34.986007212 +0000 UTC m=+678.461028282" Sep 30 06:30:35 crc kubenswrapper[4691]: I0930 06:30:35.949194 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" event={"ID":"e1989084-5e13-4ce8-9d59-050337ff70da","Type":"ContainerStarted","Data":"5e7ea8ac23c27c20d8a0e0eb0209fe2b99e105b9adf848fa6e1211759b109046"} Sep 30 06:30:35 crc kubenswrapper[4691]: I0930 06:30:35.966274 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-864bb6dfb5-r4qfl" podStartSLOduration=3.239350169 podStartE2EDuration="5.966250359s" podCreationTimestamp="2025-09-30 06:30:30 
+0000 UTC" firstStartedPulling="2025-09-30 06:30:32.181251954 +0000 UTC m=+675.656272994" lastFinishedPulling="2025-09-30 06:30:34.908152134 +0000 UTC m=+678.383173184" observedRunningTime="2025-09-30 06:30:35.963182815 +0000 UTC m=+679.438203945" watchObservedRunningTime="2025-09-30 06:30:35.966250359 +0000 UTC m=+679.441271439" Sep 30 06:30:36 crc kubenswrapper[4691]: I0930 06:30:36.958841 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg" event={"ID":"55ec66af-837d-40c5-81d2-6b311f0dc05c","Type":"ContainerStarted","Data":"f4563c0271f44ae0d1365a14e816761ae49fc6c4d6baa2d040283607a7acfb1b"} Sep 30 06:30:36 crc kubenswrapper[4691]: I0930 06:30:36.985998 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-58fcddf996-tqjqg" podStartSLOduration=1.843674636 podStartE2EDuration="6.985975967s" podCreationTimestamp="2025-09-30 06:30:30 +0000 UTC" firstStartedPulling="2025-09-30 06:30:31.41033 +0000 UTC m=+674.885351040" lastFinishedPulling="2025-09-30 06:30:36.552631321 +0000 UTC m=+680.027652371" observedRunningTime="2025-09-30 06:30:36.981881592 +0000 UTC m=+680.456902652" watchObservedRunningTime="2025-09-30 06:30:36.985975967 +0000 UTC m=+680.460997017" Sep 30 06:30:41 crc kubenswrapper[4691]: I0930 06:30:41.252734 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-gcxcd" Sep 30 06:30:41 crc kubenswrapper[4691]: I0930 06:30:41.479258 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:41 crc kubenswrapper[4691]: I0930 06:30:41.479731 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:41 crc kubenswrapper[4691]: I0930 06:30:41.487027 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:42 crc kubenswrapper[4691]: I0930 06:30:41.999810 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-77d5ffbdb7-rj9x6" Sep 30 06:30:42 crc kubenswrapper[4691]: I0930 06:30:42.075579 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-thj2p"] Sep 30 06:30:51 crc kubenswrapper[4691]: I0930 06:30:51.191520 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-6d689559c5-pz8zd" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.135039 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-thj2p" podUID="4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" containerName="console" containerID="cri-o://5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04" gracePeriod=15 Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.517755 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-thj2p_4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3/console/0.log" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.518040 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.638689 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-config\") pod \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.638741 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5ch9\" (UniqueName: \"kubernetes.io/projected/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-kube-api-access-g5ch9\") pod \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.638786 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-trusted-ca-bundle\") pod \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.638813 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-oauth-config\") pod \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.638904 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-service-ca\") pod \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.638925 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-serving-cert\") pod \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.638947 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-oauth-serving-cert\") pod \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\" (UID: \"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3\") " Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.639409 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-config" (OuterVolumeSpecName: "console-config") pod "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" (UID: "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.639449 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-service-ca" (OuterVolumeSpecName: "service-ca") pod "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" (UID: "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.639575 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" (UID: "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.639776 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" (UID: "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.640073 4691 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-service-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.640091 4691 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.640103 4691 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.640112 4691 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.655431 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-kube-api-access-g5ch9" (OuterVolumeSpecName: "kube-api-access-g5ch9") pod "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" (UID: "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3"). InnerVolumeSpecName "kube-api-access-g5ch9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.655445 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" (UID: "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.656073 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" (UID: "4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.741786 4691 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-serving-cert\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.741819 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5ch9\" (UniqueName: \"kubernetes.io/projected/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-kube-api-access-g5ch9\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:07 crc kubenswrapper[4691]: I0930 06:31:07.741830 4691 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3-console-oauth-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.191029 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-thj2p_4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3/console/0.log" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.191100 4691 generic.go:334] "Generic (PLEG): container finished" podID="4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" containerID="5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04" exitCode=2 Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.191144 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-thj2p" event={"ID":"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3","Type":"ContainerDied","Data":"5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04"} Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.191192 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-thj2p" event={"ID":"4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3","Type":"ContainerDied","Data":"2288a7d9adacb360f77c41fddbca550712f08789f2d5146f0b65344bbebcb335"} Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.191194 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-thj2p" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.191221 4691 scope.go:117] "RemoveContainer" containerID="5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.225022 4691 scope.go:117] "RemoveContainer" containerID="5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04" Sep 30 06:31:08 crc kubenswrapper[4691]: E0930 06:31:08.226616 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04\": container with ID starting with 5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04 not found: ID does not exist" containerID="5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.226674 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04"} err="failed to get container status \"5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04\": rpc error: code = NotFound desc = could not find container \"5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04\": container with ID starting with 5c1f99b3db97a62fe6cb24b3efcf5e1f2d731152a1680d0bc49103ef485d3e04 not found: ID does not exist" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.233399 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-thj2p"] Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.241402 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-thj2p"] Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.331001 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h"] Sep 30 06:31:08 crc kubenswrapper[4691]: E0930 06:31:08.331199 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" containerName="console" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.331210 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" containerName="console" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.331323 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" containerName="console" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.332053 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.334799 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.353023 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h"] Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.453287 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.453355 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpb45\" (UniqueName: \"kubernetes.io/projected/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-kube-api-access-hpb45\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.453608 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.555086 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.555165 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpb45\" (UniqueName: \"kubernetes.io/projected/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-kube-api-access-hpb45\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.555268 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.555759 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-bundle\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.556065 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-util\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.576912 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpb45\" (UniqueName: \"kubernetes.io/projected/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-kube-api-access-hpb45\") pod \"f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.658635 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:08 crc kubenswrapper[4691]: I0930 06:31:08.910408 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h"] Sep 30 06:31:09 crc kubenswrapper[4691]: I0930 06:31:09.212208 4691 generic.go:334] "Generic (PLEG): container finished" podID="cfeab7bf-5c68-43a0-8c09-cbf293e56f35" containerID="5d8a9d7b716a71740924092407f41890b23985658cb79e9e40fdaeebebfc2400" exitCode=0 Sep 30 06:31:09 crc kubenswrapper[4691]: I0930 06:31:09.212260 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" event={"ID":"cfeab7bf-5c68-43a0-8c09-cbf293e56f35","Type":"ContainerDied","Data":"5d8a9d7b716a71740924092407f41890b23985658cb79e9e40fdaeebebfc2400"} Sep 30 06:31:09 crc kubenswrapper[4691]: I0930 06:31:09.212294 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" event={"ID":"cfeab7bf-5c68-43a0-8c09-cbf293e56f35","Type":"ContainerStarted","Data":"2c7acb78bd250788d19a4ae8a211a8248566302de7b04bd03d0b810b6362be5e"} Sep 30 06:31:09 crc kubenswrapper[4691]: I0930 06:31:09.238649 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3" path="/var/lib/kubelet/pods/4fe6e337-1fbf-420c-b55a-2cdfc3b1c7c3/volumes" Sep 30 06:31:11 crc kubenswrapper[4691]: I0930 06:31:11.237691 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" event={"ID":"cfeab7bf-5c68-43a0-8c09-cbf293e56f35","Type":"ContainerStarted","Data":"6cb88328db24d431a014dba521f610e373ac0a4e50c2e2839551d3a89576a071"} Sep 30 06:31:12 crc kubenswrapper[4691]: I0930 06:31:12.246064 4691 generic.go:334] "Generic (PLEG): container finished" podID="cfeab7bf-5c68-43a0-8c09-cbf293e56f35" containerID="6cb88328db24d431a014dba521f610e373ac0a4e50c2e2839551d3a89576a071" exitCode=0 Sep 30 06:31:12 crc kubenswrapper[4691]: I0930 
06:31:12.246140 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" event={"ID":"cfeab7bf-5c68-43a0-8c09-cbf293e56f35","Type":"ContainerDied","Data":"6cb88328db24d431a014dba521f610e373ac0a4e50c2e2839551d3a89576a071"} Sep 30 06:31:13 crc kubenswrapper[4691]: I0930 06:31:13.254984 4691 generic.go:334] "Generic (PLEG): container finished" podID="cfeab7bf-5c68-43a0-8c09-cbf293e56f35" containerID="3cfa313991b9eee2a3159e616270fce847922f804fa582fd9990f1f56be88b45" exitCode=0 Sep 30 06:31:13 crc kubenswrapper[4691]: I0930 06:31:13.255025 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" event={"ID":"cfeab7bf-5c68-43a0-8c09-cbf293e56f35","Type":"ContainerDied","Data":"3cfa313991b9eee2a3159e616270fce847922f804fa582fd9990f1f56be88b45"} Sep 30 06:31:14 crc kubenswrapper[4691]: I0930 06:31:14.530626 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:14 crc kubenswrapper[4691]: I0930 06:31:14.638821 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpb45\" (UniqueName: \"kubernetes.io/projected/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-kube-api-access-hpb45\") pod \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " Sep 30 06:31:14 crc kubenswrapper[4691]: I0930 06:31:14.638967 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-bundle\") pod \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " Sep 30 06:31:14 crc kubenswrapper[4691]: I0930 06:31:14.639042 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-util\") pod \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\" (UID: \"cfeab7bf-5c68-43a0-8c09-cbf293e56f35\") " Sep 30 06:31:14 crc kubenswrapper[4691]: I0930 06:31:14.640625 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-bundle" (OuterVolumeSpecName: "bundle") pod "cfeab7bf-5c68-43a0-8c09-cbf293e56f35" (UID: "cfeab7bf-5c68-43a0-8c09-cbf293e56f35"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:31:14 crc kubenswrapper[4691]: I0930 06:31:14.647630 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-kube-api-access-hpb45" (OuterVolumeSpecName: "kube-api-access-hpb45") pod "cfeab7bf-5c68-43a0-8c09-cbf293e56f35" (UID: "cfeab7bf-5c68-43a0-8c09-cbf293e56f35"). InnerVolumeSpecName "kube-api-access-hpb45". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:31:14 crc kubenswrapper[4691]: I0930 06:31:14.661778 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-util" (OuterVolumeSpecName: "util") pod "cfeab7bf-5c68-43a0-8c09-cbf293e56f35" (UID: "cfeab7bf-5c68-43a0-8c09-cbf293e56f35"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:31:14 crc kubenswrapper[4691]: I0930 06:31:14.740491 4691 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-util\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:14 crc kubenswrapper[4691]: I0930 06:31:14.740528 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpb45\" (UniqueName: \"kubernetes.io/projected/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-kube-api-access-hpb45\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:14 crc kubenswrapper[4691]: I0930 06:31:14.740544 4691 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/cfeab7bf-5c68-43a0-8c09-cbf293e56f35-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:31:15 crc kubenswrapper[4691]: I0930 06:31:15.276728 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" event={"ID":"cfeab7bf-5c68-43a0-8c09-cbf293e56f35","Type":"ContainerDied","Data":"2c7acb78bd250788d19a4ae8a211a8248566302de7b04bd03d0b810b6362be5e"} Sep 30 06:31:15 crc kubenswrapper[4691]: I0930 06:31:15.276776 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c7acb78bd250788d19a4ae8a211a8248566302de7b04bd03d0b810b6362be5e" Sep 30 06:31:15 crc kubenswrapper[4691]: I0930 06:31:15.276785 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.513418 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt"] Sep 30 06:31:25 crc kubenswrapper[4691]: E0930 06:31:25.515061 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfeab7bf-5c68-43a0-8c09-cbf293e56f35" containerName="pull" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.515134 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfeab7bf-5c68-43a0-8c09-cbf293e56f35" containerName="pull" Sep 30 06:31:25 crc kubenswrapper[4691]: E0930 06:31:25.515190 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfeab7bf-5c68-43a0-8c09-cbf293e56f35" containerName="extract" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.515253 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfeab7bf-5c68-43a0-8c09-cbf293e56f35" containerName="extract" Sep 30 06:31:25 crc kubenswrapper[4691]: E0930 06:31:25.515307 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cfeab7bf-5c68-43a0-8c09-cbf293e56f35" containerName="util" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.515363 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cfeab7bf-5c68-43a0-8c09-cbf293e56f35" containerName="util" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.515516 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cfeab7bf-5c68-43a0-8c09-cbf293e56f35" containerName="extract" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.515976 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.518183 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.518377 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.520685 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.521028 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-x5mmn" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.526626 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt"] Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.526926 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.581753 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47mnn\" (UniqueName: \"kubernetes.io/projected/fc264033-2e29-41cc-b961-92dbd3230d34-kube-api-access-47mnn\") pod \"metallb-operator-controller-manager-5b7f74d8d8-lfcgt\" (UID: \"fc264033-2e29-41cc-b961-92dbd3230d34\") " pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.581811 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc264033-2e29-41cc-b961-92dbd3230d34-apiservice-cert\") pod \"metallb-operator-controller-manager-5b7f74d8d8-lfcgt\" (UID: \"fc264033-2e29-41cc-b961-92dbd3230d34\") " pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.581832 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc264033-2e29-41cc-b961-92dbd3230d34-webhook-cert\") pod \"metallb-operator-controller-manager-5b7f74d8d8-lfcgt\" (UID: \"fc264033-2e29-41cc-b961-92dbd3230d34\") " pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.683261 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47mnn\" (UniqueName: \"kubernetes.io/projected/fc264033-2e29-41cc-b961-92dbd3230d34-kube-api-access-47mnn\") pod \"metallb-operator-controller-manager-5b7f74d8d8-lfcgt\" (UID: \"fc264033-2e29-41cc-b961-92dbd3230d34\") " pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.683338 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc264033-2e29-41cc-b961-92dbd3230d34-apiservice-cert\") pod \"metallb-operator-controller-manager-5b7f74d8d8-lfcgt\" (UID: \"fc264033-2e29-41cc-b961-92dbd3230d34\") " pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.683366 
4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc264033-2e29-41cc-b961-92dbd3230d34-webhook-cert\") pod \"metallb-operator-controller-manager-5b7f74d8d8-lfcgt\" (UID: \"fc264033-2e29-41cc-b961-92dbd3230d34\") " pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.689776 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc264033-2e29-41cc-b961-92dbd3230d34-apiservice-cert\") pod \"metallb-operator-controller-manager-5b7f74d8d8-lfcgt\" (UID: \"fc264033-2e29-41cc-b961-92dbd3230d34\") " pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.704324 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47mnn\" (UniqueName: \"kubernetes.io/projected/fc264033-2e29-41cc-b961-92dbd3230d34-kube-api-access-47mnn\") pod \"metallb-operator-controller-manager-5b7f74d8d8-lfcgt\" (UID: \"fc264033-2e29-41cc-b961-92dbd3230d34\") " pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.708534 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc264033-2e29-41cc-b961-92dbd3230d34-webhook-cert\") pod \"metallb-operator-controller-manager-5b7f74d8d8-lfcgt\" (UID: \"fc264033-2e29-41cc-b961-92dbd3230d34\") " pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.836837 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.923901 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz"] Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.924585 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.926841 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-l8gd7" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.926993 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.928565 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Sep 30 06:31:25 crc kubenswrapper[4691]: I0930 06:31:25.979832 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz"] Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.002023 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xz5x\" (UniqueName: \"kubernetes.io/projected/164027cf-f7af-41cc-bbd2-e3a725230c9e-kube-api-access-2xz5x\") pod \"metallb-operator-webhook-server-b8f956b88-zp5fz\" (UID: \"164027cf-f7af-41cc-bbd2-e3a725230c9e\") " pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.002113 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/164027cf-f7af-41cc-bbd2-e3a725230c9e-apiservice-cert\") pod \"metallb-operator-webhook-server-b8f956b88-zp5fz\" (UID: \"164027cf-f7af-41cc-bbd2-e3a725230c9e\") " pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.002177 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164027cf-f7af-41cc-bbd2-e3a725230c9e-webhook-cert\") pod \"metallb-operator-webhook-server-b8f956b88-zp5fz\" (UID: \"164027cf-f7af-41cc-bbd2-e3a725230c9e\") " pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.103511 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xz5x\" (UniqueName: \"kubernetes.io/projected/164027cf-f7af-41cc-bbd2-e3a725230c9e-kube-api-access-2xz5x\") pod \"metallb-operator-webhook-server-b8f956b88-zp5fz\" (UID: \"164027cf-f7af-41cc-bbd2-e3a725230c9e\") " pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.103575 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/164027cf-f7af-41cc-bbd2-e3a725230c9e-apiservice-cert\") pod \"metallb-operator-webhook-server-b8f956b88-zp5fz\" (UID: \"164027cf-f7af-41cc-bbd2-e3a725230c9e\") " pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.103621 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164027cf-f7af-41cc-bbd2-e3a725230c9e-webhook-cert\") pod \"metallb-operator-webhook-server-b8f956b88-zp5fz\" (UID: \"164027cf-f7af-41cc-bbd2-e3a725230c9e\") " pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.107381 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/164027cf-f7af-41cc-bbd2-e3a725230c9e-apiservice-cert\") pod \"metallb-operator-webhook-server-b8f956b88-zp5fz\" (UID: \"164027cf-f7af-41cc-bbd2-e3a725230c9e\") " pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.108047 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/164027cf-f7af-41cc-bbd2-e3a725230c9e-webhook-cert\") pod \"metallb-operator-webhook-server-b8f956b88-zp5fz\" (UID: \"164027cf-f7af-41cc-bbd2-e3a725230c9e\") " pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.120252 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xz5x\" (UniqueName: \"kubernetes.io/projected/164027cf-f7af-41cc-bbd2-e3a725230c9e-kube-api-access-2xz5x\") pod \"metallb-operator-webhook-server-b8f956b88-zp5fz\" (UID: \"164027cf-f7af-41cc-bbd2-e3a725230c9e\") " pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.239931 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.370609 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt"] Sep 30 06:31:26 crc kubenswrapper[4691]: I0930 06:31:26.538870 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz"] Sep 30 06:31:26 crc kubenswrapper[4691]: W0930 06:31:26.543109 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod164027cf_f7af_41cc_bbd2_e3a725230c9e.slice/crio-439dd9609d61676e94c7c1c9c4d2c88734a4bce6e96ac7ed0ec1990c99f2d0ee WatchSource:0}: Error finding container 439dd9609d61676e94c7c1c9c4d2c88734a4bce6e96ac7ed0ec1990c99f2d0ee: Status 404 returned error can't find the container with id 439dd9609d61676e94c7c1c9c4d2c88734a4bce6e96ac7ed0ec1990c99f2d0ee Sep 30 06:31:27 crc kubenswrapper[4691]: I0930 06:31:27.355083 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" event={"ID":"164027cf-f7af-41cc-bbd2-e3a725230c9e","Type":"ContainerStarted","Data":"439dd9609d61676e94c7c1c9c4d2c88734a4bce6e96ac7ed0ec1990c99f2d0ee"} Sep 30 06:31:27 crc kubenswrapper[4691]: I0930 06:31:27.356617 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" event={"ID":"fc264033-2e29-41cc-b961-92dbd3230d34","Type":"ContainerStarted","Data":"86e90ffda768ea8cbb102f1ef6680c1bc3c8ffc7bced5cbd0e7471546e475abd"} Sep 30 06:31:32 crc kubenswrapper[4691]: I0930 06:31:32.404458 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" event={"ID":"fc264033-2e29-41cc-b961-92dbd3230d34","Type":"ContainerStarted","Data":"2c1fb217ac3dc36bd4c8c76560a24eb25d8ffb0e5c8e51e486c4b7434c4d1491"} Sep 30 06:31:32 crc kubenswrapper[4691]: I0930 06:31:32.405796 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:31:32 crc kubenswrapper[4691]: I0930 06:31:32.407151 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" event={"ID":"164027cf-f7af-41cc-bbd2-e3a725230c9e","Type":"ContainerStarted","Data":"1f1b30b121f9ef666fbf8c2eb9f6fd96ba92b5b88cb5f8b130112924d09bd2f3"} Sep 30 06:31:32 crc kubenswrapper[4691]: I0930 06:31:32.407789 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:32 crc kubenswrapper[4691]: I0930 06:31:32.431471 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" podStartSLOduration=2.456950398 podStartE2EDuration="7.431445802s" podCreationTimestamp="2025-09-30 06:31:25 +0000 UTC" firstStartedPulling="2025-09-30 06:31:26.393162702 +0000 UTC m=+729.868183742" lastFinishedPulling="2025-09-30 06:31:31.367658106 +0000 UTC m=+734.842679146" observedRunningTime="2025-09-30 06:31:32.42853863 +0000 UTC m=+735.903559750" watchObservedRunningTime="2025-09-30 06:31:32.431445802 +0000 UTC m=+735.906466862" Sep 30 06:31:32 crc kubenswrapper[4691]: I0930 06:31:32.459671 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" podStartSLOduration=2.617972494 podStartE2EDuration="7.459640908s" podCreationTimestamp="2025-09-30 06:31:25 +0000 UTC" firstStartedPulling="2025-09-30 06:31:26.545497142 +0000 UTC m=+730.020518172" lastFinishedPulling="2025-09-30 06:31:31.387165546 +0000 UTC m=+734.862186586" observedRunningTime="2025-09-30 06:31:32.451441517 +0000 UTC m=+735.926462617" watchObservedRunningTime="2025-09-30 06:31:32.459640908 +0000 UTC m=+735.934661978" Sep 30 06:31:46 crc kubenswrapper[4691]: I0930 06:31:46.254357 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-b8f956b88-zp5fz" Sep 30 06:31:51 crc kubenswrapper[4691]: I0930 06:31:51.752998 4691 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Sep 30 06:32:05 crc kubenswrapper[4691]: I0930 06:32:05.840678 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-5b7f74d8d8-lfcgt" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.667059 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jbzhs"] Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.677657 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr"] Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.677788 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.681710 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.689598 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-2w6q5" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.693595 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.693824 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.694046 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.715805 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr"] Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.773376 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-vqw66"] Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.774224 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vqw66" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.776214 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.776684 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.776980 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.778323 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-nslmj" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.790689 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-5d688f5ffc-52gd8"] Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.791624 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.794694 4691 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.797383 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9949a206-ebbd-42f2-8b22-8dfcf266b934-frr-startup\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.797430 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45188dc2-6524-4d87-bfaa-676d46684df8-cert\") pod \"frr-k8s-webhook-server-5478bdb765-4szpr\" (UID: \"45188dc2-6524-4d87-bfaa-676d46684df8\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.797451 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9949a206-ebbd-42f2-8b22-8dfcf266b934-metrics-certs\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.797472 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-frr-sockets\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.797489 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-metrics\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.797511 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6rlx\" (UniqueName: \"kubernetes.io/projected/45188dc2-6524-4d87-bfaa-676d46684df8-kube-api-access-c6rlx\") pod \"frr-k8s-webhook-server-5478bdb765-4szpr\" (UID: \"45188dc2-6524-4d87-bfaa-676d46684df8\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.797530 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-reloader\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.797551 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs62x\" (UniqueName: \"kubernetes.io/projected/9949a206-ebbd-42f2-8b22-8dfcf266b934-kube-api-access-zs62x\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.797569 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" 
(UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-frr-conf\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.808494 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-52gd8"] Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903214 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmqk9\" (UniqueName: \"kubernetes.io/projected/238ed092-dc40-4e7d-add0-854dd611a65f-kube-api-access-zmqk9\") pod \"controller-5d688f5ffc-52gd8\" (UID: \"238ed092-dc40-4e7d-add0-854dd611a65f\") " pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903273 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-metrics-certs\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903307 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45188dc2-6524-4d87-bfaa-676d46684df8-cert\") pod \"frr-k8s-webhook-server-5478bdb765-4szpr\" (UID: \"45188dc2-6524-4d87-bfaa-676d46684df8\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903336 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9949a206-ebbd-42f2-8b22-8dfcf266b934-metrics-certs\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903369 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-frr-sockets\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903395 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-metrics\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903427 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6rlx\" (UniqueName: \"kubernetes.io/projected/45188dc2-6524-4d87-bfaa-676d46684df8-kube-api-access-c6rlx\") pod \"frr-k8s-webhook-server-5478bdb765-4szpr\" (UID: \"45188dc2-6524-4d87-bfaa-676d46684df8\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903455 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/238ed092-dc40-4e7d-add0-854dd611a65f-cert\") pod \"controller-5d688f5ffc-52gd8\" (UID: \"238ed092-dc40-4e7d-add0-854dd611a65f\") " pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903480 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-reloader\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903506 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w898r\" (UniqueName: \"kubernetes.io/projected/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-kube-api-access-w898r\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903534 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/238ed092-dc40-4e7d-add0-854dd611a65f-metrics-certs\") pod \"controller-5d688f5ffc-52gd8\" (UID: \"238ed092-dc40-4e7d-add0-854dd611a65f\") " pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903562 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs62x\" (UniqueName: \"kubernetes.io/projected/9949a206-ebbd-42f2-8b22-8dfcf266b934-kube-api-access-zs62x\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903592 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-frr-conf\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903622 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-memberlist\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903675 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9949a206-ebbd-42f2-8b22-8dfcf266b934-frr-startup\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903708 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-metallb-excludel2\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903856 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-frr-sockets\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.903951 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-metrics\") pod \"frr-k8s-jbzhs\" (UID: 
\"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.904424 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-reloader\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.904463 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/9949a206-ebbd-42f2-8b22-8dfcf266b934-frr-conf\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.904746 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/9949a206-ebbd-42f2-8b22-8dfcf266b934-frr-startup\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.909042 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9949a206-ebbd-42f2-8b22-8dfcf266b934-metrics-certs\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.909169 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/45188dc2-6524-4d87-bfaa-676d46684df8-cert\") pod \"frr-k8s-webhook-server-5478bdb765-4szpr\" (UID: \"45188dc2-6524-4d87-bfaa-676d46684df8\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.924787 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6rlx\" (UniqueName: \"kubernetes.io/projected/45188dc2-6524-4d87-bfaa-676d46684df8-kube-api-access-c6rlx\") pod \"frr-k8s-webhook-server-5478bdb765-4szpr\" (UID: \"45188dc2-6524-4d87-bfaa-676d46684df8\") " pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" Sep 30 06:32:06 crc kubenswrapper[4691]: I0930 06:32:06.939803 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs62x\" (UniqueName: \"kubernetes.io/projected/9949a206-ebbd-42f2-8b22-8dfcf266b934-kube-api-access-zs62x\") pod \"frr-k8s-jbzhs\" (UID: \"9949a206-ebbd-42f2-8b22-8dfcf266b934\") " pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.004965 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-memberlist\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.005061 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-metallb-excludel2\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.005117 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmqk9\" (UniqueName: 
\"kubernetes.io/projected/238ed092-dc40-4e7d-add0-854dd611a65f-kube-api-access-zmqk9\") pod \"controller-5d688f5ffc-52gd8\" (UID: \"238ed092-dc40-4e7d-add0-854dd611a65f\") " pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.005159 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-metrics-certs\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:07 crc kubenswrapper[4691]: E0930 06:32:07.005226 4691 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.005241 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/238ed092-dc40-4e7d-add0-854dd611a65f-cert\") pod \"controller-5d688f5ffc-52gd8\" (UID: \"238ed092-dc40-4e7d-add0-854dd611a65f\") " pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:07 crc kubenswrapper[4691]: E0930 06:32:07.005331 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-memberlist podName:cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6 nodeName:}" failed. No retries permitted until 2025-09-30 06:32:07.50530114 +0000 UTC m=+770.980322220 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-memberlist") pod "speaker-vqw66" (UID: "cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6") : secret "metallb-memberlist" not found Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.005387 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w898r\" (UniqueName: \"kubernetes.io/projected/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-kube-api-access-w898r\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.005427 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/238ed092-dc40-4e7d-add0-854dd611a65f-metrics-certs\") pod \"controller-5d688f5ffc-52gd8\" (UID: \"238ed092-dc40-4e7d-add0-854dd611a65f\") " pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.006045 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-metallb-excludel2\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.007325 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.008511 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/238ed092-dc40-4e7d-add0-854dd611a65f-cert\") pod \"controller-5d688f5ffc-52gd8\" (UID: \"238ed092-dc40-4e7d-add0-854dd611a65f\") " pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.008729 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-metrics-certs\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.009347 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/238ed092-dc40-4e7d-add0-854dd611a65f-metrics-certs\") pod \"controller-5d688f5ffc-52gd8\" (UID: \"238ed092-dc40-4e7d-add0-854dd611a65f\") " pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.016522 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.025947 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmqk9\" (UniqueName: \"kubernetes.io/projected/238ed092-dc40-4e7d-add0-854dd611a65f-kube-api-access-zmqk9\") pod \"controller-5d688f5ffc-52gd8\" (UID: \"238ed092-dc40-4e7d-add0-854dd611a65f\") " pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.027489 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w898r\" (UniqueName: \"kubernetes.io/projected/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-kube-api-access-w898r\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.105774 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:07 crc kubenswrapper[4691]: W0930 06:32:07.425169 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod238ed092_dc40_4e7d_add0_854dd611a65f.slice/crio-5a12b99096a7519b20b4617f006cfc766c205f530055e31d08e16bb4bfbdc114 WatchSource:0}: Error finding container 5a12b99096a7519b20b4617f006cfc766c205f530055e31d08e16bb4bfbdc114: Status 404 returned error can't find the container with id 5a12b99096a7519b20b4617f006cfc766c205f530055e31d08e16bb4bfbdc114 Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.425543 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-5d688f5ffc-52gd8"] Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.510957 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-memberlist\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:07 crc kubenswrapper[4691]: E0930 06:32:07.511085 4691 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Sep 30 06:32:07 crc kubenswrapper[4691]: E0930 06:32:07.511154 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-memberlist podName:cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6 nodeName:}" failed. No retries permitted until 2025-09-30 06:32:08.51113578 +0000 UTC m=+771.986156820 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-memberlist") pod "speaker-vqw66" (UID: "cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6") : secret "metallb-memberlist" not found Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.619362 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr"] Sep 30 06:32:07 crc kubenswrapper[4691]: W0930 06:32:07.629422 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod45188dc2_6524_4d87_bfaa_676d46684df8.slice/crio-20ee411a8d80bbacc62a0457f57362a156ba2f1c26a567e183aea7424b5e4f88 WatchSource:0}: Error finding container 20ee411a8d80bbacc62a0457f57362a156ba2f1c26a567e183aea7424b5e4f88: Status 404 returned error can't find the container with id 20ee411a8d80bbacc62a0457f57362a156ba2f1c26a567e183aea7424b5e4f88 Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.653700 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbzhs" event={"ID":"9949a206-ebbd-42f2-8b22-8dfcf266b934","Type":"ContainerStarted","Data":"fbd21606ad264b61a0254dc30f1526c7a4ba056ab15b1f5425ff33899d4add6d"} Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.655280 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" event={"ID":"45188dc2-6524-4d87-bfaa-676d46684df8","Type":"ContainerStarted","Data":"20ee411a8d80bbacc62a0457f57362a156ba2f1c26a567e183aea7424b5e4f88"} Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.657353 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-52gd8" 
event={"ID":"238ed092-dc40-4e7d-add0-854dd611a65f","Type":"ContainerStarted","Data":"6bfd37da462589e884eb0e0b7ee5dc76bf08066bb6e65e0e604d3e056286cfa8"} Sep 30 06:32:07 crc kubenswrapper[4691]: I0930 06:32:07.657398 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-52gd8" event={"ID":"238ed092-dc40-4e7d-add0-854dd611a65f","Type":"ContainerStarted","Data":"5a12b99096a7519b20b4617f006cfc766c205f530055e31d08e16bb4bfbdc114"} Sep 30 06:32:08 crc kubenswrapper[4691]: I0930 06:32:08.522533 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-memberlist\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:08 crc kubenswrapper[4691]: I0930 06:32:08.537837 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6-memberlist\") pod \"speaker-vqw66\" (UID: \"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6\") " pod="metallb-system/speaker-vqw66" Sep 30 06:32:08 crc kubenswrapper[4691]: I0930 06:32:08.586447 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-vqw66" Sep 30 06:32:08 crc kubenswrapper[4691]: W0930 06:32:08.632369 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbbc08d3_615a_450a_9a3e_7f0aba1c5ff6.slice/crio-d2c17738461e5e6ed9bfa142b6c697daaa6dbff56b61f120e299709ea67737ac WatchSource:0}: Error finding container d2c17738461e5e6ed9bfa142b6c697daaa6dbff56b61f120e299709ea67737ac: Status 404 returned error can't find the container with id d2c17738461e5e6ed9bfa142b6c697daaa6dbff56b61f120e299709ea67737ac Sep 30 06:32:08 crc kubenswrapper[4691]: I0930 06:32:08.664291 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vqw66" event={"ID":"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6","Type":"ContainerStarted","Data":"d2c17738461e5e6ed9bfa142b6c697daaa6dbff56b61f120e299709ea67737ac"} Sep 30 06:32:08 crc kubenswrapper[4691]: I0930 06:32:08.668073 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-5d688f5ffc-52gd8" event={"ID":"238ed092-dc40-4e7d-add0-854dd611a65f","Type":"ContainerStarted","Data":"439b8ee5cfc74a7ee4d487008b1ed5c8bd62d0bf56dc31fe593c9df4e8ec45f6"} Sep 30 06:32:08 crc kubenswrapper[4691]: I0930 06:32:08.668246 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:08 crc kubenswrapper[4691]: I0930 06:32:08.691665 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-5d688f5ffc-52gd8" podStartSLOduration=2.691644343 podStartE2EDuration="2.691644343s" podCreationTimestamp="2025-09-30 06:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:32:08.690319852 +0000 UTC m=+772.165340902" watchObservedRunningTime="2025-09-30 06:32:08.691644343 +0000 UTC m=+772.166665393" Sep 30 06:32:09 crc kubenswrapper[4691]: I0930 06:32:09.680464 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vqw66" 
event={"ID":"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6","Type":"ContainerStarted","Data":"ffe7687bef7bd1aaf1cb9f89db328309ce15f88e8a343ff32fe8a5dd6f21804b"} Sep 30 06:32:09 crc kubenswrapper[4691]: I0930 06:32:09.680518 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-vqw66" event={"ID":"cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6","Type":"ContainerStarted","Data":"2b783c5f0c7a99be88d1f577935c0e1cf4a809208b8a4d20a6abddcde1a405ab"} Sep 30 06:32:09 crc kubenswrapper[4691]: I0930 06:32:09.708160 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-vqw66" podStartSLOduration=3.708145127 podStartE2EDuration="3.708145127s" podCreationTimestamp="2025-09-30 06:32:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:32:09.704261554 +0000 UTC m=+773.179282604" watchObservedRunningTime="2025-09-30 06:32:09.708145127 +0000 UTC m=+773.183166167" Sep 30 06:32:10 crc kubenswrapper[4691]: I0930 06:32:10.687015 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-vqw66" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.181575 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-zxjk8"] Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.183126 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.186388 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zxjk8"] Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.378861 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr94h\" (UniqueName: \"kubernetes.io/projected/0d716488-0a49-43b4-a686-a734182c811e-kube-api-access-jr94h\") pod \"community-operators-zxjk8\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.378923 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-catalog-content\") pod \"community-operators-zxjk8\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.378979 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-utilities\") pod \"community-operators-zxjk8\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.480274 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-utilities\") pod \"community-operators-zxjk8\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.480424 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr94h\" (UniqueName: 
\"kubernetes.io/projected/0d716488-0a49-43b4-a686-a734182c811e-kube-api-access-jr94h\") pod \"community-operators-zxjk8\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.480455 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-catalog-content\") pod \"community-operators-zxjk8\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.480845 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-utilities\") pod \"community-operators-zxjk8\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.481001 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-catalog-content\") pod \"community-operators-zxjk8\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.502406 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr94h\" (UniqueName: \"kubernetes.io/projected/0d716488-0a49-43b4-a686-a734182c811e-kube-api-access-jr94h\") pod \"community-operators-zxjk8\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:12 crc kubenswrapper[4691]: I0930 06:32:12.534111 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:15 crc kubenswrapper[4691]: I0930 06:32:15.177887 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-zxjk8"] Sep 30 06:32:15 crc kubenswrapper[4691]: I0930 06:32:15.728232 4691 generic.go:334] "Generic (PLEG): container finished" podID="0d716488-0a49-43b4-a686-a734182c811e" containerID="5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415" exitCode=0 Sep 30 06:32:15 crc kubenswrapper[4691]: I0930 06:32:15.728303 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxjk8" event={"ID":"0d716488-0a49-43b4-a686-a734182c811e","Type":"ContainerDied","Data":"5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415"} Sep 30 06:32:15 crc kubenswrapper[4691]: I0930 06:32:15.728367 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxjk8" event={"ID":"0d716488-0a49-43b4-a686-a734182c811e","Type":"ContainerStarted","Data":"35c5b0e5e37df64b992f3a142361c6adcb6ce7050e218f70e73135c3cb0722f6"} Sep 30 06:32:15 crc kubenswrapper[4691]: I0930 06:32:15.730771 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" event={"ID":"45188dc2-6524-4d87-bfaa-676d46684df8","Type":"ContainerStarted","Data":"bbb16ab7265680f31056e65aca3dfc6a41995c9502d3427b6da85db2f25f297d"} Sep 30 06:32:15 crc kubenswrapper[4691]: I0930 06:32:15.731064 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" Sep 30 06:32:15 crc kubenswrapper[4691]: I0930 06:32:15.734312 4691 generic.go:334] "Generic (PLEG): container finished" podID="9949a206-ebbd-42f2-8b22-8dfcf266b934" containerID="b8f716bf00f7f8e64523eeae84f29a0450b1fa7ac3d2b80593e4dbe927db1f4d" exitCode=0 Sep 30 06:32:15 crc kubenswrapper[4691]: I0930 06:32:15.734432 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbzhs" event={"ID":"9949a206-ebbd-42f2-8b22-8dfcf266b934","Type":"ContainerDied","Data":"b8f716bf00f7f8e64523eeae84f29a0450b1fa7ac3d2b80593e4dbe927db1f4d"} Sep 30 06:32:15 crc kubenswrapper[4691]: I0930 06:32:15.808446 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" podStartSLOduration=2.58764863 podStartE2EDuration="9.808416886s" podCreationTimestamp="2025-09-30 06:32:06 +0000 UTC" firstStartedPulling="2025-09-30 06:32:07.632528787 +0000 UTC m=+771.107549857" lastFinishedPulling="2025-09-30 06:32:14.853297033 +0000 UTC m=+778.328318113" observedRunningTime="2025-09-30 06:32:15.808054905 +0000 UTC m=+779.283075995" watchObservedRunningTime="2025-09-30 06:32:15.808416886 +0000 UTC m=+779.283437966" Sep 30 06:32:16 crc kubenswrapper[4691]: I0930 06:32:16.743963 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxjk8" event={"ID":"0d716488-0a49-43b4-a686-a734182c811e","Type":"ContainerStarted","Data":"9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543"} Sep 30 06:32:16 crc kubenswrapper[4691]: I0930 06:32:16.748106 4691 generic.go:334] "Generic (PLEG): container finished" podID="9949a206-ebbd-42f2-8b22-8dfcf266b934" containerID="9d2440bb15d4bb34526d33d238589d295e408a66bf0c1c701c6e9f429598cf77" exitCode=0 Sep 30 06:32:16 crc kubenswrapper[4691]: I0930 06:32:16.748156 4691 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="metallb-system/frr-k8s-jbzhs" event={"ID":"9949a206-ebbd-42f2-8b22-8dfcf266b934","Type":"ContainerDied","Data":"9d2440bb15d4bb34526d33d238589d295e408a66bf0c1c701c6e9f429598cf77"} Sep 30 06:32:17 crc kubenswrapper[4691]: I0930 06:32:17.111964 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-5d688f5ffc-52gd8" Sep 30 06:32:17 crc kubenswrapper[4691]: I0930 06:32:17.758932 4691 generic.go:334] "Generic (PLEG): container finished" podID="9949a206-ebbd-42f2-8b22-8dfcf266b934" containerID="2375bedaf2b306d7e7364668e3e4f08a11be861a4894d701fba9fd1fe9e3f9d4" exitCode=0 Sep 30 06:32:17 crc kubenswrapper[4691]: I0930 06:32:17.759085 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbzhs" event={"ID":"9949a206-ebbd-42f2-8b22-8dfcf266b934","Type":"ContainerDied","Data":"2375bedaf2b306d7e7364668e3e4f08a11be861a4894d701fba9fd1fe9e3f9d4"} Sep 30 06:32:17 crc kubenswrapper[4691]: I0930 06:32:17.763271 4691 generic.go:334] "Generic (PLEG): container finished" podID="0d716488-0a49-43b4-a686-a734182c811e" containerID="9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543" exitCode=0 Sep 30 06:32:17 crc kubenswrapper[4691]: I0930 06:32:17.763321 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxjk8" event={"ID":"0d716488-0a49-43b4-a686-a734182c811e","Type":"ContainerDied","Data":"9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543"} Sep 30 06:32:18 crc kubenswrapper[4691]: I0930 06:32:18.591436 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-vqw66" Sep 30 06:32:18 crc kubenswrapper[4691]: I0930 06:32:18.773295 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbzhs" event={"ID":"9949a206-ebbd-42f2-8b22-8dfcf266b934","Type":"ContainerStarted","Data":"afc8d729d5cbfbd6319034de235360b99edbebdc24862c54947610c2b295407b"} Sep 30 06:32:18 crc kubenswrapper[4691]: I0930 06:32:18.773344 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbzhs" event={"ID":"9949a206-ebbd-42f2-8b22-8dfcf266b934","Type":"ContainerStarted","Data":"9892e891b098cd0ca10ffed507fbae7fee9c619ab26d8a80f6c53b0e8145f022"} Sep 30 06:32:18 crc kubenswrapper[4691]: I0930 06:32:18.773357 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbzhs" event={"ID":"9949a206-ebbd-42f2-8b22-8dfcf266b934","Type":"ContainerStarted","Data":"0b77841c081ad7105873f8fa002c1bd5c7f2dc64d9b609c762ff1d5bda17c175"} Sep 30 06:32:18 crc kubenswrapper[4691]: I0930 06:32:18.773368 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbzhs" event={"ID":"9949a206-ebbd-42f2-8b22-8dfcf266b934","Type":"ContainerStarted","Data":"d403030988ac1ca1b7873ee0d923b376f3ff094e13302c46f81f9c44ca708997"} Sep 30 06:32:18 crc kubenswrapper[4691]: I0930 06:32:18.773378 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbzhs" event={"ID":"9949a206-ebbd-42f2-8b22-8dfcf266b934","Type":"ContainerStarted","Data":"07e8f6ec259980c1946e8e3b347a00c9a5d8e020bf66ebae525a59127c83b28d"} Sep 30 06:32:18 crc kubenswrapper[4691]: I0930 06:32:18.775243 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxjk8" event={"ID":"0d716488-0a49-43b4-a686-a734182c811e","Type":"ContainerStarted","Data":"6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b"} Sep 30 
06:32:18 crc kubenswrapper[4691]: I0930 06:32:18.804449 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-zxjk8" podStartSLOduration=4.293910311 podStartE2EDuration="6.804402456s" podCreationTimestamp="2025-09-30 06:32:12 +0000 UTC" firstStartedPulling="2025-09-30 06:32:15.731166743 +0000 UTC m=+779.206187793" lastFinishedPulling="2025-09-30 06:32:18.241658898 +0000 UTC m=+781.716679938" observedRunningTime="2025-09-30 06:32:18.798572781 +0000 UTC m=+782.273593851" watchObservedRunningTime="2025-09-30 06:32:18.804402456 +0000 UTC m=+782.279423506" Sep 30 06:32:19 crc kubenswrapper[4691]: I0930 06:32:19.789945 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jbzhs" event={"ID":"9949a206-ebbd-42f2-8b22-8dfcf266b934","Type":"ContainerStarted","Data":"783d4632f7af39965e79f57c0a24c78b7b8015166376d4e598cf4609f2061167"} Sep 30 06:32:19 crc kubenswrapper[4691]: I0930 06:32:19.790309 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:19 crc kubenswrapper[4691]: I0930 06:32:19.825789 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jbzhs" podStartSLOduration=6.202779888 podStartE2EDuration="13.825765294s" podCreationTimestamp="2025-09-30 06:32:06 +0000 UTC" firstStartedPulling="2025-09-30 06:32:07.207744531 +0000 UTC m=+770.682765571" lastFinishedPulling="2025-09-30 06:32:14.830729917 +0000 UTC m=+778.305750977" observedRunningTime="2025-09-30 06:32:19.821575281 +0000 UTC m=+783.296596391" watchObservedRunningTime="2025-09-30 06:32:19.825765294 +0000 UTC m=+783.300786364" Sep 30 06:32:22 crc kubenswrapper[4691]: I0930 06:32:22.008645 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:22 crc kubenswrapper[4691]: I0930 06:32:22.064009 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:22 crc kubenswrapper[4691]: I0930 06:32:22.535220 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:22 crc kubenswrapper[4691]: I0930 06:32:22.535287 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:22 crc kubenswrapper[4691]: I0930 06:32:22.596048 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:22 crc kubenswrapper[4691]: I0930 06:32:22.850379 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:32:22 crc kubenswrapper[4691]: I0930 06:32:22.850473 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.179537 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-swhwg"] Sep 30 
06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.180793 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-swhwg" Sep 30 06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.183558 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Sep 30 06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.184313 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-wls64" Sep 30 06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.184870 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Sep 30 06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.196577 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-swhwg"] Sep 30 06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.294952 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b59hr\" (UniqueName: \"kubernetes.io/projected/664688ee-c3cc-4f92-86b7-64d53b8c133d-kube-api-access-b59hr\") pod \"openstack-operator-index-swhwg\" (UID: \"664688ee-c3cc-4f92-86b7-64d53b8c133d\") " pod="openstack-operators/openstack-operator-index-swhwg" Sep 30 06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.396657 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b59hr\" (UniqueName: \"kubernetes.io/projected/664688ee-c3cc-4f92-86b7-64d53b8c133d-kube-api-access-b59hr\") pod \"openstack-operator-index-swhwg\" (UID: \"664688ee-c3cc-4f92-86b7-64d53b8c133d\") " pod="openstack-operators/openstack-operator-index-swhwg" Sep 30 06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.430509 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b59hr\" (UniqueName: \"kubernetes.io/projected/664688ee-c3cc-4f92-86b7-64d53b8c133d-kube-api-access-b59hr\") pod \"openstack-operator-index-swhwg\" (UID: \"664688ee-c3cc-4f92-86b7-64d53b8c133d\") " pod="openstack-operators/openstack-operator-index-swhwg" Sep 30 06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.512266 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-swhwg" Sep 30 06:32:26 crc kubenswrapper[4691]: I0930 06:32:26.968935 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-swhwg"] Sep 30 06:32:26 crc kubenswrapper[4691]: W0930 06:32:26.975653 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod664688ee_c3cc_4f92_86b7_64d53b8c133d.slice/crio-b06b41c359af310404321de595b0d5f23f7d926098d6253d8ba7e34cdf3d25ee WatchSource:0}: Error finding container b06b41c359af310404321de595b0d5f23f7d926098d6253d8ba7e34cdf3d25ee: Status 404 returned error can't find the container with id b06b41c359af310404321de595b0d5f23f7d926098d6253d8ba7e34cdf3d25ee Sep 30 06:32:27 crc kubenswrapper[4691]: I0930 06:32:27.010772 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jbzhs" Sep 30 06:32:27 crc kubenswrapper[4691]: I0930 06:32:27.021735 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-5478bdb765-4szpr" Sep 30 06:32:27 crc kubenswrapper[4691]: I0930 06:32:27.853295 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swhwg" event={"ID":"664688ee-c3cc-4f92-86b7-64d53b8c133d","Type":"ContainerStarted","Data":"b06b41c359af310404321de595b0d5f23f7d926098d6253d8ba7e34cdf3d25ee"} Sep 30 06:32:31 crc kubenswrapper[4691]: I0930 06:32:31.904089 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-swhwg" event={"ID":"664688ee-c3cc-4f92-86b7-64d53b8c133d","Type":"ContainerStarted","Data":"3c5a4053dbe1a2fbd3b9c6ec5f230cbd3fa35c22a999d9ae24673d1b9737307c"} Sep 30 06:32:31 crc kubenswrapper[4691]: I0930 06:32:31.936470 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-swhwg" podStartSLOduration=1.916897251 podStartE2EDuration="5.936444577s" podCreationTimestamp="2025-09-30 06:32:26 +0000 UTC" firstStartedPulling="2025-09-30 06:32:26.977544199 +0000 UTC m=+790.452565239" lastFinishedPulling="2025-09-30 06:32:30.997091525 +0000 UTC m=+794.472112565" observedRunningTime="2025-09-30 06:32:31.929787736 +0000 UTC m=+795.404808826" watchObservedRunningTime="2025-09-30 06:32:31.936444577 +0000 UTC m=+795.411465647" Sep 30 06:32:32 crc kubenswrapper[4691]: I0930 06:32:32.605075 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:36 crc kubenswrapper[4691]: I0930 06:32:36.512616 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-swhwg" Sep 30 06:32:36 crc kubenswrapper[4691]: I0930 06:32:36.512964 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-swhwg" Sep 30 06:32:36 crc kubenswrapper[4691]: I0930 06:32:36.558105 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-swhwg" Sep 30 06:32:36 crc kubenswrapper[4691]: I0930 06:32:36.785790 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mjxc7"] Sep 30 06:32:36 crc kubenswrapper[4691]: I0930 06:32:36.788026 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:36 crc kubenswrapper[4691]: I0930 06:32:36.806582 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mjxc7"] Sep 30 06:32:36 crc kubenswrapper[4691]: I0930 06:32:36.949995 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvfzb\" (UniqueName: \"kubernetes.io/projected/76f92011-8ebb-4535-ac7f-fa625a6ceefe-kube-api-access-nvfzb\") pod \"redhat-operators-mjxc7\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:36 crc kubenswrapper[4691]: I0930 06:32:36.950406 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-utilities\") pod \"redhat-operators-mjxc7\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:36 crc kubenswrapper[4691]: I0930 06:32:36.950463 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-catalog-content\") pod \"redhat-operators-mjxc7\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:36 crc kubenswrapper[4691]: I0930 06:32:36.977154 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-swhwg" Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.052058 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-utilities\") pod \"redhat-operators-mjxc7\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.052131 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-catalog-content\") pod \"redhat-operators-mjxc7\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.052228 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvfzb\" (UniqueName: \"kubernetes.io/projected/76f92011-8ebb-4535-ac7f-fa625a6ceefe-kube-api-access-nvfzb\") pod \"redhat-operators-mjxc7\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.053018 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-utilities\") pod \"redhat-operators-mjxc7\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.053397 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-catalog-content\") pod \"redhat-operators-mjxc7\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " 
pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.079133 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvfzb\" (UniqueName: \"kubernetes.io/projected/76f92011-8ebb-4535-ac7f-fa625a6ceefe-kube-api-access-nvfzb\") pod \"redhat-operators-mjxc7\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.171775 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:37 crc kubenswrapper[4691]: W0930 06:32:37.592580 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f92011_8ebb_4535_ac7f_fa625a6ceefe.slice/crio-4b0ccdbcf2b2a4b6adb480fc10ba316168032dc3faf4fb19a8e0661b6a606e09 WatchSource:0}: Error finding container 4b0ccdbcf2b2a4b6adb480fc10ba316168032dc3faf4fb19a8e0661b6a606e09: Status 404 returned error can't find the container with id 4b0ccdbcf2b2a4b6adb480fc10ba316168032dc3faf4fb19a8e0661b6a606e09 Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.593048 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mjxc7"] Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.946385 4691 generic.go:334] "Generic (PLEG): container finished" podID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerID="b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3" exitCode=0 Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.946436 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjxc7" event={"ID":"76f92011-8ebb-4535-ac7f-fa625a6ceefe","Type":"ContainerDied","Data":"b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3"} Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.946927 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjxc7" event={"ID":"76f92011-8ebb-4535-ac7f-fa625a6ceefe","Type":"ContainerStarted","Data":"4b0ccdbcf2b2a4b6adb480fc10ba316168032dc3faf4fb19a8e0661b6a606e09"} Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.965582 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zxjk8"] Sep 30 06:32:37 crc kubenswrapper[4691]: I0930 06:32:37.965845 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-zxjk8" podUID="0d716488-0a49-43b4-a686-a734182c811e" containerName="registry-server" containerID="cri-o://6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b" gracePeriod=2 Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.367152 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.468572 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-utilities\") pod \"0d716488-0a49-43b4-a686-a734182c811e\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.468649 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-catalog-content\") pod \"0d716488-0a49-43b4-a686-a734182c811e\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.468716 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr94h\" (UniqueName: \"kubernetes.io/projected/0d716488-0a49-43b4-a686-a734182c811e-kube-api-access-jr94h\") pod \"0d716488-0a49-43b4-a686-a734182c811e\" (UID: \"0d716488-0a49-43b4-a686-a734182c811e\") " Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.469307 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-utilities" (OuterVolumeSpecName: "utilities") pod "0d716488-0a49-43b4-a686-a734182c811e" (UID: "0d716488-0a49-43b4-a686-a734182c811e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.487176 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d716488-0a49-43b4-a686-a734182c811e-kube-api-access-jr94h" (OuterVolumeSpecName: "kube-api-access-jr94h") pod "0d716488-0a49-43b4-a686-a734182c811e" (UID: "0d716488-0a49-43b4-a686-a734182c811e"). InnerVolumeSpecName "kube-api-access-jr94h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.519121 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0d716488-0a49-43b4-a686-a734182c811e" (UID: "0d716488-0a49-43b4-a686-a734182c811e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.570273 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.570308 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0d716488-0a49-43b4-a686-a734182c811e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.570325 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr94h\" (UniqueName: \"kubernetes.io/projected/0d716488-0a49-43b4-a686-a734182c811e-kube-api-access-jr94h\") on node \"crc\" DevicePath \"\"" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.956450 4691 generic.go:334] "Generic (PLEG): container finished" podID="0d716488-0a49-43b4-a686-a734182c811e" containerID="6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b" exitCode=0 Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.956506 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxjk8" event={"ID":"0d716488-0a49-43b4-a686-a734182c811e","Type":"ContainerDied","Data":"6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b"} Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.956546 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-zxjk8" event={"ID":"0d716488-0a49-43b4-a686-a734182c811e","Type":"ContainerDied","Data":"35c5b0e5e37df64b992f3a142361c6adcb6ce7050e218f70e73135c3cb0722f6"} Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.956562 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-zxjk8" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.956587 4691 scope.go:117] "RemoveContainer" containerID="6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.973864 4691 scope.go:117] "RemoveContainer" containerID="9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.993749 4691 scope.go:117] "RemoveContainer" containerID="5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415" Sep 30 06:32:38 crc kubenswrapper[4691]: I0930 06:32:38.998371 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-zxjk8"] Sep 30 06:32:39 crc kubenswrapper[4691]: I0930 06:32:39.005390 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-zxjk8"] Sep 30 06:32:39 crc kubenswrapper[4691]: I0930 06:32:39.020932 4691 scope.go:117] "RemoveContainer" containerID="6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b" Sep 30 06:32:39 crc kubenswrapper[4691]: E0930 06:32:39.021643 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b\": container with ID starting with 6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b not found: ID does not exist" containerID="6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b" Sep 30 06:32:39 crc kubenswrapper[4691]: I0930 06:32:39.021708 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b"} err="failed to get container status \"6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b\": rpc error: code = NotFound desc = could not find container \"6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b\": container with ID starting with 6878d1fb89b37ec5721cc1868825b0921b3647b019f21bb1143f7437b70cfe4b not found: ID does not exist" Sep 30 06:32:39 crc kubenswrapper[4691]: I0930 06:32:39.021746 4691 scope.go:117] "RemoveContainer" containerID="9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543" Sep 30 06:32:39 crc kubenswrapper[4691]: E0930 06:32:39.022386 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543\": container with ID starting with 9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543 not found: ID does not exist" containerID="9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543" Sep 30 06:32:39 crc kubenswrapper[4691]: I0930 06:32:39.022431 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543"} err="failed to get container status \"9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543\": rpc error: code = NotFound desc = could not find container \"9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543\": container with ID starting with 9cb80fa6c422e886a9e5e09baa3d11d6fa14841c87bd8f48666be5ea34a26543 not found: ID does not exist" Sep 30 06:32:39 crc kubenswrapper[4691]: I0930 06:32:39.022457 4691 scope.go:117] "RemoveContainer" 
containerID="5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415" Sep 30 06:32:39 crc kubenswrapper[4691]: E0930 06:32:39.023038 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415\": container with ID starting with 5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415 not found: ID does not exist" containerID="5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415" Sep 30 06:32:39 crc kubenswrapper[4691]: I0930 06:32:39.023158 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415"} err="failed to get container status \"5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415\": rpc error: code = NotFound desc = could not find container \"5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415\": container with ID starting with 5fc5a3cde990be09d7deaa275c0d709f94aa8febac2547047c489e1b2ed50415 not found: ID does not exist" Sep 30 06:32:39 crc kubenswrapper[4691]: I0930 06:32:39.239494 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d716488-0a49-43b4-a686-a734182c811e" path="/var/lib/kubelet/pods/0d716488-0a49-43b4-a686-a734182c811e/volumes" Sep 30 06:32:39 crc kubenswrapper[4691]: I0930 06:32:39.976003 4691 generic.go:334] "Generic (PLEG): container finished" podID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerID="d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa" exitCode=0 Sep 30 06:32:39 crc kubenswrapper[4691]: I0930 06:32:39.976053 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjxc7" event={"ID":"76f92011-8ebb-4535-ac7f-fa625a6ceefe","Type":"ContainerDied","Data":"d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa"} Sep 30 06:32:40 crc kubenswrapper[4691]: I0930 06:32:40.825446 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz"] Sep 30 06:32:40 crc kubenswrapper[4691]: E0930 06:32:40.825974 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d716488-0a49-43b4-a686-a734182c811e" containerName="extract-content" Sep 30 06:32:40 crc kubenswrapper[4691]: I0930 06:32:40.826014 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d716488-0a49-43b4-a686-a734182c811e" containerName="extract-content" Sep 30 06:32:40 crc kubenswrapper[4691]: E0930 06:32:40.826042 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d716488-0a49-43b4-a686-a734182c811e" containerName="extract-utilities" Sep 30 06:32:40 crc kubenswrapper[4691]: I0930 06:32:40.826059 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d716488-0a49-43b4-a686-a734182c811e" containerName="extract-utilities" Sep 30 06:32:40 crc kubenswrapper[4691]: E0930 06:32:40.826077 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d716488-0a49-43b4-a686-a734182c811e" containerName="registry-server" Sep 30 06:32:40 crc kubenswrapper[4691]: I0930 06:32:40.826095 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d716488-0a49-43b4-a686-a734182c811e" containerName="registry-server" Sep 30 06:32:40 crc kubenswrapper[4691]: I0930 06:32:40.826365 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d716488-0a49-43b4-a686-a734182c811e" 
containerName="registry-server" Sep 30 06:32:40 crc kubenswrapper[4691]: I0930 06:32:40.833367 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:40 crc kubenswrapper[4691]: I0930 06:32:40.837864 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-s4grx" Sep 30 06:32:40 crc kubenswrapper[4691]: I0930 06:32:40.864955 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz"] Sep 30 06:32:40 crc kubenswrapper[4691]: I0930 06:32:40.983169 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjxc7" event={"ID":"76f92011-8ebb-4535-ac7f-fa625a6ceefe","Type":"ContainerStarted","Data":"914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847"} Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.003252 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mjxc7" podStartSLOduration=2.384476786 podStartE2EDuration="5.003229471s" podCreationTimestamp="2025-09-30 06:32:36 +0000 UTC" firstStartedPulling="2025-09-30 06:32:37.947906276 +0000 UTC m=+801.422927336" lastFinishedPulling="2025-09-30 06:32:40.566658941 +0000 UTC m=+804.041680021" observedRunningTime="2025-09-30 06:32:40.999415969 +0000 UTC m=+804.474437029" watchObservedRunningTime="2025-09-30 06:32:41.003229471 +0000 UTC m=+804.478250521" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.004177 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-bundle\") pod \"3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.004238 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkhrk\" (UniqueName: \"kubernetes.io/projected/ed991450-2c1e-4d1a-a54c-196c5067ce69-kube-api-access-kkhrk\") pod \"3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.004686 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-util\") pod \"3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.105994 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-util\") pod \"3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.106064 
4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-bundle\") pod \"3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.106102 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkhrk\" (UniqueName: \"kubernetes.io/projected/ed991450-2c1e-4d1a-a54c-196c5067ce69-kube-api-access-kkhrk\") pod \"3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.106644 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-util\") pod \"3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.106701 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-bundle\") pod \"3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.129567 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkhrk\" (UniqueName: \"kubernetes.io/projected/ed991450-2c1e-4d1a-a54c-196c5067ce69-kube-api-access-kkhrk\") pod \"3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.154726 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.552829 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz"] Sep 30 06:32:41 crc kubenswrapper[4691]: W0930 06:32:41.558664 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded991450_2c1e_4d1a_a54c_196c5067ce69.slice/crio-39b51aeaadaac8fa062eeb498774bea295d7ff198c63514e177731ca16942ed2 WatchSource:0}: Error finding container 39b51aeaadaac8fa062eeb498774bea295d7ff198c63514e177731ca16942ed2: Status 404 returned error can't find the container with id 39b51aeaadaac8fa062eeb498774bea295d7ff198c63514e177731ca16942ed2 Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.991638 4691 generic.go:334] "Generic (PLEG): container finished" podID="ed991450-2c1e-4d1a-a54c-196c5067ce69" containerID="0b5e2bd7eb669792266b97e7049b0b7a5935f43659f349a4fce556182e5e9b3f" exitCode=0 Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.991734 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" event={"ID":"ed991450-2c1e-4d1a-a54c-196c5067ce69","Type":"ContainerDied","Data":"0b5e2bd7eb669792266b97e7049b0b7a5935f43659f349a4fce556182e5e9b3f"} Sep 30 06:32:41 crc kubenswrapper[4691]: I0930 06:32:41.991990 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" event={"ID":"ed991450-2c1e-4d1a-a54c-196c5067ce69","Type":"ContainerStarted","Data":"39b51aeaadaac8fa062eeb498774bea295d7ff198c63514e177731ca16942ed2"} Sep 30 06:32:42 crc kubenswrapper[4691]: I0930 06:32:42.999205 4691 generic.go:334] "Generic (PLEG): container finished" podID="ed991450-2c1e-4d1a-a54c-196c5067ce69" containerID="7138f085b935d5b9db6d805431d22496362dede51f9fb2c577dd7a6397eeeafd" exitCode=0 Sep 30 06:32:42 crc kubenswrapper[4691]: I0930 06:32:42.999305 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" event={"ID":"ed991450-2c1e-4d1a-a54c-196c5067ce69","Type":"ContainerDied","Data":"7138f085b935d5b9db6d805431d22496362dede51f9fb2c577dd7a6397eeeafd"} Sep 30 06:32:44 crc kubenswrapper[4691]: I0930 06:32:44.006216 4691 generic.go:334] "Generic (PLEG): container finished" podID="ed991450-2c1e-4d1a-a54c-196c5067ce69" containerID="440ff04dde3c0a030126cf64d27a870390d14bea1e26a4435057fb00dc72da39" exitCode=0 Sep 30 06:32:44 crc kubenswrapper[4691]: I0930 06:32:44.006456 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" event={"ID":"ed991450-2c1e-4d1a-a54c-196c5067ce69","Type":"ContainerDied","Data":"440ff04dde3c0a030126cf64d27a870390d14bea1e26a4435057fb00dc72da39"} Sep 30 06:32:45 crc kubenswrapper[4691]: I0930 06:32:45.342992 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:45 crc kubenswrapper[4691]: I0930 06:32:45.463067 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-util\") pod \"ed991450-2c1e-4d1a-a54c-196c5067ce69\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " Sep 30 06:32:45 crc kubenswrapper[4691]: I0930 06:32:45.463187 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-bundle\") pod \"ed991450-2c1e-4d1a-a54c-196c5067ce69\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " Sep 30 06:32:45 crc kubenswrapper[4691]: I0930 06:32:45.463320 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkhrk\" (UniqueName: \"kubernetes.io/projected/ed991450-2c1e-4d1a-a54c-196c5067ce69-kube-api-access-kkhrk\") pod \"ed991450-2c1e-4d1a-a54c-196c5067ce69\" (UID: \"ed991450-2c1e-4d1a-a54c-196c5067ce69\") " Sep 30 06:32:45 crc kubenswrapper[4691]: I0930 06:32:45.463787 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-bundle" (OuterVolumeSpecName: "bundle") pod "ed991450-2c1e-4d1a-a54c-196c5067ce69" (UID: "ed991450-2c1e-4d1a-a54c-196c5067ce69"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:32:45 crc kubenswrapper[4691]: I0930 06:32:45.470344 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed991450-2c1e-4d1a-a54c-196c5067ce69-kube-api-access-kkhrk" (OuterVolumeSpecName: "kube-api-access-kkhrk") pod "ed991450-2c1e-4d1a-a54c-196c5067ce69" (UID: "ed991450-2c1e-4d1a-a54c-196c5067ce69"). InnerVolumeSpecName "kube-api-access-kkhrk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:32:45 crc kubenswrapper[4691]: I0930 06:32:45.476035 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-util" (OuterVolumeSpecName: "util") pod "ed991450-2c1e-4d1a-a54c-196c5067ce69" (UID: "ed991450-2c1e-4d1a-a54c-196c5067ce69"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:32:45 crc kubenswrapper[4691]: I0930 06:32:45.564562 4691 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-util\") on node \"crc\" DevicePath \"\"" Sep 30 06:32:45 crc kubenswrapper[4691]: I0930 06:32:45.564613 4691 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ed991450-2c1e-4d1a-a54c-196c5067ce69-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:32:45 crc kubenswrapper[4691]: I0930 06:32:45.564631 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkhrk\" (UniqueName: \"kubernetes.io/projected/ed991450-2c1e-4d1a-a54c-196c5067ce69-kube-api-access-kkhrk\") on node \"crc\" DevicePath \"\"" Sep 30 06:32:46 crc kubenswrapper[4691]: I0930 06:32:46.020857 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" event={"ID":"ed991450-2c1e-4d1a-a54c-196c5067ce69","Type":"ContainerDied","Data":"39b51aeaadaac8fa062eeb498774bea295d7ff198c63514e177731ca16942ed2"} Sep 30 06:32:46 crc kubenswrapper[4691]: I0930 06:32:46.020953 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39b51aeaadaac8fa062eeb498774bea295d7ff198c63514e177731ca16942ed2" Sep 30 06:32:46 crc kubenswrapper[4691]: I0930 06:32:46.020970 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.172480 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.172849 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.232509 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.586121 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8j2"] Sep 30 06:32:47 crc kubenswrapper[4691]: E0930 06:32:47.586462 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed991450-2c1e-4d1a-a54c-196c5067ce69" containerName="pull" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.586485 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed991450-2c1e-4d1a-a54c-196c5067ce69" containerName="pull" Sep 30 06:32:47 crc kubenswrapper[4691]: E0930 06:32:47.586506 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed991450-2c1e-4d1a-a54c-196c5067ce69" containerName="util" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.586516 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed991450-2c1e-4d1a-a54c-196c5067ce69" containerName="util" Sep 30 06:32:47 crc kubenswrapper[4691]: E0930 06:32:47.586551 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed991450-2c1e-4d1a-a54c-196c5067ce69" containerName="extract" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.586560 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed991450-2c1e-4d1a-a54c-196c5067ce69" containerName="extract" Sep 30 06:32:47 crc kubenswrapper[4691]: 
I0930 06:32:47.586750 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed991450-2c1e-4d1a-a54c-196c5067ce69" containerName="extract" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.588033 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.602016 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8j2"] Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.692274 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-utilities\") pod \"redhat-marketplace-kl8j2\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.692372 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rn8d6\" (UniqueName: \"kubernetes.io/projected/2c175656-b086-4abd-aefb-ccea22610682-kube-api-access-rn8d6\") pod \"redhat-marketplace-kl8j2\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.692441 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-catalog-content\") pod \"redhat-marketplace-kl8j2\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.793479 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-catalog-content\") pod \"redhat-marketplace-kl8j2\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.793658 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-utilities\") pod \"redhat-marketplace-kl8j2\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.793718 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rn8d6\" (UniqueName: \"kubernetes.io/projected/2c175656-b086-4abd-aefb-ccea22610682-kube-api-access-rn8d6\") pod \"redhat-marketplace-kl8j2\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.794631 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-catalog-content\") pod \"redhat-marketplace-kl8j2\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.794807 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-utilities\") pod \"redhat-marketplace-kl8j2\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.819942 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rn8d6\" (UniqueName: \"kubernetes.io/projected/2c175656-b086-4abd-aefb-ccea22610682-kube-api-access-rn8d6\") pod \"redhat-marketplace-kl8j2\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:47 crc kubenswrapper[4691]: I0930 06:32:47.964252 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:48 crc kubenswrapper[4691]: I0930 06:32:48.094467 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:48 crc kubenswrapper[4691]: I0930 06:32:48.440882 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8j2"] Sep 30 06:32:49 crc kubenswrapper[4691]: I0930 06:32:49.055290 4691 generic.go:334] "Generic (PLEG): container finished" podID="2c175656-b086-4abd-aefb-ccea22610682" containerID="ffc9d71753e9eacb9a1d3ca76b308e21c24be77ba5f50906ef6bcc5b9d365c22" exitCode=0 Sep 30 06:32:49 crc kubenswrapper[4691]: I0930 06:32:49.055367 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8j2" event={"ID":"2c175656-b086-4abd-aefb-ccea22610682","Type":"ContainerDied","Data":"ffc9d71753e9eacb9a1d3ca76b308e21c24be77ba5f50906ef6bcc5b9d365c22"} Sep 30 06:32:49 crc kubenswrapper[4691]: I0930 06:32:49.055678 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8j2" event={"ID":"2c175656-b086-4abd-aefb-ccea22610682","Type":"ContainerStarted","Data":"6b99547cfc4769bee715b9fc447a306556a4b59c2abdc183858621567e912a8a"} Sep 30 06:32:50 crc kubenswrapper[4691]: I0930 06:32:50.062467 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8j2" event={"ID":"2c175656-b086-4abd-aefb-ccea22610682","Type":"ContainerStarted","Data":"78b60c337c92c7f8d4536bcb109244fe5b6f72f5abb981a706a88b089397545e"} Sep 30 06:32:51 crc kubenswrapper[4691]: I0930 06:32:51.100507 4691 generic.go:334] "Generic (PLEG): container finished" podID="2c175656-b086-4abd-aefb-ccea22610682" containerID="78b60c337c92c7f8d4536bcb109244fe5b6f72f5abb981a706a88b089397545e" exitCode=0 Sep 30 06:32:51 crc kubenswrapper[4691]: I0930 06:32:51.100551 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8j2" event={"ID":"2c175656-b086-4abd-aefb-ccea22610682","Type":"ContainerDied","Data":"78b60c337c92c7f8d4536bcb109244fe5b6f72f5abb981a706a88b089397545e"} Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.035064 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv"] Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.036622 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.038750 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-bwrtq" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.072229 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv"] Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.125636 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8j2" event={"ID":"2c175656-b086-4abd-aefb-ccea22610682","Type":"ContainerStarted","Data":"f1f65fa0704472e0580d1fe5ed37d96b3fbb5eb9db6112e4514f74a3299e5826"} Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.154172 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kl8j2" podStartSLOduration=2.601300523 podStartE2EDuration="5.154154775s" podCreationTimestamp="2025-09-30 06:32:47 +0000 UTC" firstStartedPulling="2025-09-30 06:32:49.057160756 +0000 UTC m=+812.532181836" lastFinishedPulling="2025-09-30 06:32:51.610015048 +0000 UTC m=+815.085036088" observedRunningTime="2025-09-30 06:32:52.149267249 +0000 UTC m=+815.624288329" watchObservedRunningTime="2025-09-30 06:32:52.154154775 +0000 UTC m=+815.629175815" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.155903 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s79z7\" (UniqueName: \"kubernetes.io/projected/e5d557a0-1dee-462b-89d5-86c8479ef2e4-kube-api-access-s79z7\") pod \"openstack-operator-controller-operator-6bb46fb86b-vx6kv\" (UID: \"e5d557a0-1dee-462b-89d5-86c8479ef2e4\") " pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.257776 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s79z7\" (UniqueName: \"kubernetes.io/projected/e5d557a0-1dee-462b-89d5-86c8479ef2e4-kube-api-access-s79z7\") pod \"openstack-operator-controller-operator-6bb46fb86b-vx6kv\" (UID: \"e5d557a0-1dee-462b-89d5-86c8479ef2e4\") " pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.285847 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s79z7\" (UniqueName: \"kubernetes.io/projected/e5d557a0-1dee-462b-89d5-86c8479ef2e4-kube-api-access-s79z7\") pod \"openstack-operator-controller-operator-6bb46fb86b-vx6kv\" (UID: \"e5d557a0-1dee-462b-89d5-86c8479ef2e4\") " pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.354140 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.684577 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv"] Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.770638 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qwgtj"] Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.772683 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.779513 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qwgtj"] Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.850068 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.850158 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.866016 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-utilities\") pod \"certified-operators-qwgtj\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.866055 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-catalog-content\") pod \"certified-operators-qwgtj\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.866099 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vftxr\" (UniqueName: \"kubernetes.io/projected/c188a486-111c-4c1b-9fbb-385fbe372986-kube-api-access-vftxr\") pod \"certified-operators-qwgtj\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.968163 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-utilities\") pod \"certified-operators-qwgtj\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.968217 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-catalog-content\") pod \"certified-operators-qwgtj\" (UID: 
\"c188a486-111c-4c1b-9fbb-385fbe372986\") " pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.968246 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vftxr\" (UniqueName: \"kubernetes.io/projected/c188a486-111c-4c1b-9fbb-385fbe372986-kube-api-access-vftxr\") pod \"certified-operators-qwgtj\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.968962 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-utilities\") pod \"certified-operators-qwgtj\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.969015 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-catalog-content\") pod \"certified-operators-qwgtj\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:52 crc kubenswrapper[4691]: I0930 06:32:52.986998 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vftxr\" (UniqueName: \"kubernetes.io/projected/c188a486-111c-4c1b-9fbb-385fbe372986-kube-api-access-vftxr\") pod \"certified-operators-qwgtj\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:53 crc kubenswrapper[4691]: I0930 06:32:53.090501 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:32:53 crc kubenswrapper[4691]: I0930 06:32:53.141934 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" event={"ID":"e5d557a0-1dee-462b-89d5-86c8479ef2e4","Type":"ContainerStarted","Data":"4f621835dfdf6044bf541a379c9293439ace903c7be0a88f8fb7024ebaafecab"} Sep 30 06:32:53 crc kubenswrapper[4691]: I0930 06:32:53.356560 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qwgtj"] Sep 30 06:32:54 crc kubenswrapper[4691]: I0930 06:32:54.149387 4691 generic.go:334] "Generic (PLEG): container finished" podID="c188a486-111c-4c1b-9fbb-385fbe372986" containerID="a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed" exitCode=0 Sep 30 06:32:54 crc kubenswrapper[4691]: I0930 06:32:54.149452 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwgtj" event={"ID":"c188a486-111c-4c1b-9fbb-385fbe372986","Type":"ContainerDied","Data":"a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed"} Sep 30 06:32:54 crc kubenswrapper[4691]: I0930 06:32:54.149638 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwgtj" event={"ID":"c188a486-111c-4c1b-9fbb-385fbe372986","Type":"ContainerStarted","Data":"72e5cae328ea1ce65a116a62a86f7f9dae94899bf7b9318984fc5f15228240a9"} Sep 30 06:32:57 crc kubenswrapper[4691]: I0930 06:32:57.173350 4691 generic.go:334] "Generic (PLEG): container finished" podID="c188a486-111c-4c1b-9fbb-385fbe372986" containerID="c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0" exitCode=0 Sep 30 06:32:57 crc kubenswrapper[4691]: I0930 06:32:57.173593 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwgtj" event={"ID":"c188a486-111c-4c1b-9fbb-385fbe372986","Type":"ContainerDied","Data":"c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0"} Sep 30 06:32:57 crc kubenswrapper[4691]: I0930 06:32:57.175854 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" event={"ID":"e5d557a0-1dee-462b-89d5-86c8479ef2e4","Type":"ContainerStarted","Data":"cb23ed8ecb81fb8aab3506b38271bebdef9f8455b308fe9c96292d70ba8f0343"} Sep 30 06:32:57 crc kubenswrapper[4691]: I0930 06:32:57.965304 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:57 crc kubenswrapper[4691]: I0930 06:32:57.965697 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:58 crc kubenswrapper[4691]: I0930 06:32:58.008852 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:58 crc kubenswrapper[4691]: I0930 06:32:58.167069 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mjxc7"] Sep 30 06:32:58 crc kubenswrapper[4691]: I0930 06:32:58.168413 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mjxc7" podUID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerName="registry-server" containerID="cri-o://914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847" gracePeriod=2 Sep 30 06:32:58 
crc kubenswrapper[4691]: I0930 06:32:58.230029 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:32:58 crc kubenswrapper[4691]: E0930 06:32:58.279604 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f92011_8ebb_4535_ac7f_fa625a6ceefe.slice/crio-914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f92011_8ebb_4535_ac7f_fa625a6ceefe.slice/crio-conmon-914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847.scope\": RecentStats: unable to find data in memory cache]" Sep 30 06:32:58 crc kubenswrapper[4691]: I0930 06:32:58.943861 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.061201 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvfzb\" (UniqueName: \"kubernetes.io/projected/76f92011-8ebb-4535-ac7f-fa625a6ceefe-kube-api-access-nvfzb\") pod \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.061432 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-utilities\") pod \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.061465 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-catalog-content\") pod \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\" (UID: \"76f92011-8ebb-4535-ac7f-fa625a6ceefe\") " Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.063054 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-utilities" (OuterVolumeSpecName: "utilities") pod "76f92011-8ebb-4535-ac7f-fa625a6ceefe" (UID: "76f92011-8ebb-4535-ac7f-fa625a6ceefe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.067329 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76f92011-8ebb-4535-ac7f-fa625a6ceefe-kube-api-access-nvfzb" (OuterVolumeSpecName: "kube-api-access-nvfzb") pod "76f92011-8ebb-4535-ac7f-fa625a6ceefe" (UID: "76f92011-8ebb-4535-ac7f-fa625a6ceefe"). InnerVolumeSpecName "kube-api-access-nvfzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.142856 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76f92011-8ebb-4535-ac7f-fa625a6ceefe" (UID: "76f92011-8ebb-4535-ac7f-fa625a6ceefe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.162833 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.162874 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76f92011-8ebb-4535-ac7f-fa625a6ceefe-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.162903 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvfzb\" (UniqueName: \"kubernetes.io/projected/76f92011-8ebb-4535-ac7f-fa625a6ceefe-kube-api-access-nvfzb\") on node \"crc\" DevicePath \"\"" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.193156 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwgtj" event={"ID":"c188a486-111c-4c1b-9fbb-385fbe372986","Type":"ContainerStarted","Data":"e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f"} Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.195689 4691 generic.go:334] "Generic (PLEG): container finished" podID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerID="914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847" exitCode=0 Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.195712 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjxc7" event={"ID":"76f92011-8ebb-4535-ac7f-fa625a6ceefe","Type":"ContainerDied","Data":"914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847"} Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.195748 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mjxc7" event={"ID":"76f92011-8ebb-4535-ac7f-fa625a6ceefe","Type":"ContainerDied","Data":"4b0ccdbcf2b2a4b6adb480fc10ba316168032dc3faf4fb19a8e0661b6a606e09"} Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.195747 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mjxc7" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.195778 4691 scope.go:117] "RemoveContainer" containerID="914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.199986 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" event={"ID":"e5d557a0-1dee-462b-89d5-86c8479ef2e4","Type":"ContainerStarted","Data":"773e4c96fbd14872ec0811783e41e604123a88e131a77ea7b5107861490fdb36"} Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.221094 4691 scope.go:117] "RemoveContainer" containerID="d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.228805 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qwgtj" podStartSLOduration=3.382393884 podStartE2EDuration="7.22878515s" podCreationTimestamp="2025-09-30 06:32:52 +0000 UTC" firstStartedPulling="2025-09-30 06:32:55.028040916 +0000 UTC m=+818.503061946" lastFinishedPulling="2025-09-30 06:32:58.874432162 +0000 UTC m=+822.349453212" observedRunningTime="2025-09-30 06:32:59.218473581 +0000 UTC m=+822.693494651" watchObservedRunningTime="2025-09-30 06:32:59.22878515 +0000 UTC m=+822.703806200" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.237780 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mjxc7"] Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.238344 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mjxc7"] Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.273499 4691 scope.go:117] "RemoveContainer" containerID="b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.293833 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" podStartSLOduration=1.059831686 podStartE2EDuration="7.293798594s" podCreationTimestamp="2025-09-30 06:32:52 +0000 UTC" firstStartedPulling="2025-09-30 06:32:52.695208713 +0000 UTC m=+816.170229753" lastFinishedPulling="2025-09-30 06:32:58.929175601 +0000 UTC m=+822.404196661" observedRunningTime="2025-09-30 06:32:59.275790373 +0000 UTC m=+822.750811413" watchObservedRunningTime="2025-09-30 06:32:59.293798594 +0000 UTC m=+822.768819644" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.297310 4691 scope.go:117] "RemoveContainer" containerID="914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847" Sep 30 06:32:59 crc kubenswrapper[4691]: E0930 06:32:59.297912 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847\": container with ID starting with 914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847 not found: ID does not exist" containerID="914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.297959 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847"} err="failed to get container status 
\"914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847\": rpc error: code = NotFound desc = could not find container \"914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847\": container with ID starting with 914436edde955b2e8de893bf200651d814bf3225165e8e12cf3b2eccb830a847 not found: ID does not exist" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.297987 4691 scope.go:117] "RemoveContainer" containerID="d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa" Sep 30 06:32:59 crc kubenswrapper[4691]: E0930 06:32:59.299141 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa\": container with ID starting with d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa not found: ID does not exist" containerID="d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.299198 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa"} err="failed to get container status \"d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa\": rpc error: code = NotFound desc = could not find container \"d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa\": container with ID starting with d1e80c611742cfa8aeb954b2a721c3767674b5be728eaeea16f3912c159eacaa not found: ID does not exist" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.299230 4691 scope.go:117] "RemoveContainer" containerID="b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3" Sep 30 06:32:59 crc kubenswrapper[4691]: E0930 06:32:59.299689 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3\": container with ID starting with b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3 not found: ID does not exist" containerID="b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3" Sep 30 06:32:59 crc kubenswrapper[4691]: I0930 06:32:59.299722 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3"} err="failed to get container status \"b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3\": rpc error: code = NotFound desc = could not find container \"b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3\": container with ID starting with b3aa18a92e0669c257495ae4bf2a7deb7151a69d5d3302a2fb275709456f34a3 not found: ID does not exist" Sep 30 06:33:00 crc kubenswrapper[4691]: I0930 06:33:00.209395 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" Sep 30 06:33:01 crc kubenswrapper[4691]: I0930 06:33:01.219537 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-6bb46fb86b-vx6kv" Sep 30 06:33:01 crc kubenswrapper[4691]: I0930 06:33:01.238704 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" path="/var/lib/kubelet/pods/76f92011-8ebb-4535-ac7f-fa625a6ceefe/volumes" Sep 30 06:33:02 crc kubenswrapper[4691]: I0930 06:33:02.970262 4691 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8j2"] Sep 30 06:33:02 crc kubenswrapper[4691]: I0930 06:33:02.971213 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kl8j2" podUID="2c175656-b086-4abd-aefb-ccea22610682" containerName="registry-server" containerID="cri-o://f1f65fa0704472e0580d1fe5ed37d96b3fbb5eb9db6112e4514f74a3299e5826" gracePeriod=2 Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.126962 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.135254 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.216612 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.251957 4691 generic.go:334] "Generic (PLEG): container finished" podID="2c175656-b086-4abd-aefb-ccea22610682" containerID="f1f65fa0704472e0580d1fe5ed37d96b3fbb5eb9db6112e4514f74a3299e5826" exitCode=0 Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.252429 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8j2" event={"ID":"2c175656-b086-4abd-aefb-ccea22610682","Type":"ContainerDied","Data":"f1f65fa0704472e0580d1fe5ed37d96b3fbb5eb9db6112e4514f74a3299e5826"} Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.398735 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.437788 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-utilities\") pod \"2c175656-b086-4abd-aefb-ccea22610682\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.437850 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rn8d6\" (UniqueName: \"kubernetes.io/projected/2c175656-b086-4abd-aefb-ccea22610682-kube-api-access-rn8d6\") pod \"2c175656-b086-4abd-aefb-ccea22610682\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.437875 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-catalog-content\") pod \"2c175656-b086-4abd-aefb-ccea22610682\" (UID: \"2c175656-b086-4abd-aefb-ccea22610682\") " Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.438816 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-utilities" (OuterVolumeSpecName: "utilities") pod "2c175656-b086-4abd-aefb-ccea22610682" (UID: "2c175656-b086-4abd-aefb-ccea22610682"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.443295 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c175656-b086-4abd-aefb-ccea22610682-kube-api-access-rn8d6" (OuterVolumeSpecName: "kube-api-access-rn8d6") pod "2c175656-b086-4abd-aefb-ccea22610682" (UID: "2c175656-b086-4abd-aefb-ccea22610682"). InnerVolumeSpecName "kube-api-access-rn8d6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.452220 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2c175656-b086-4abd-aefb-ccea22610682" (UID: "2c175656-b086-4abd-aefb-ccea22610682"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.539502 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.539537 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rn8d6\" (UniqueName: \"kubernetes.io/projected/2c175656-b086-4abd-aefb-ccea22610682-kube-api-access-rn8d6\") on node \"crc\" DevicePath \"\"" Sep 30 06:33:03 crc kubenswrapper[4691]: I0930 06:33:03.539548 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2c175656-b086-4abd-aefb-ccea22610682-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:33:04 crc kubenswrapper[4691]: I0930 06:33:04.260053 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kl8j2" event={"ID":"2c175656-b086-4abd-aefb-ccea22610682","Type":"ContainerDied","Data":"6b99547cfc4769bee715b9fc447a306556a4b59c2abdc183858621567e912a8a"} Sep 30 06:33:04 crc kubenswrapper[4691]: I0930 06:33:04.260095 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kl8j2" Sep 30 06:33:04 crc kubenswrapper[4691]: I0930 06:33:04.260346 4691 scope.go:117] "RemoveContainer" containerID="f1f65fa0704472e0580d1fe5ed37d96b3fbb5eb9db6112e4514f74a3299e5826" Sep 30 06:33:04 crc kubenswrapper[4691]: I0930 06:33:04.284663 4691 scope.go:117] "RemoveContainer" containerID="78b60c337c92c7f8d4536bcb109244fe5b6f72f5abb981a706a88b089397545e" Sep 30 06:33:04 crc kubenswrapper[4691]: I0930 06:33:04.303443 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8j2"] Sep 30 06:33:04 crc kubenswrapper[4691]: I0930 06:33:04.312402 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kl8j2"] Sep 30 06:33:04 crc kubenswrapper[4691]: I0930 06:33:04.324557 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:33:04 crc kubenswrapper[4691]: I0930 06:33:04.331118 4691 scope.go:117] "RemoveContainer" containerID="ffc9d71753e9eacb9a1d3ca76b308e21c24be77ba5f50906ef6bcc5b9d365c22" Sep 30 06:33:05 crc kubenswrapper[4691]: I0930 06:33:05.238187 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c175656-b086-4abd-aefb-ccea22610682" path="/var/lib/kubelet/pods/2c175656-b086-4abd-aefb-ccea22610682/volumes" Sep 30 06:33:07 crc kubenswrapper[4691]: I0930 06:33:07.763609 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qwgtj"] Sep 30 06:33:07 crc kubenswrapper[4691]: I0930 06:33:07.764015 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qwgtj" podUID="c188a486-111c-4c1b-9fbb-385fbe372986" containerName="registry-server" containerID="cri-o://e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f" gracePeriod=2 Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.247153 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.294205 4691 generic.go:334] "Generic (PLEG): container finished" podID="c188a486-111c-4c1b-9fbb-385fbe372986" containerID="e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f" exitCode=0 Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.294242 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwgtj" event={"ID":"c188a486-111c-4c1b-9fbb-385fbe372986","Type":"ContainerDied","Data":"e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f"} Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.294276 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qwgtj" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.294305 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwgtj" event={"ID":"c188a486-111c-4c1b-9fbb-385fbe372986","Type":"ContainerDied","Data":"72e5cae328ea1ce65a116a62a86f7f9dae94899bf7b9318984fc5f15228240a9"} Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.294326 4691 scope.go:117] "RemoveContainer" containerID="e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.303718 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vftxr\" (UniqueName: \"kubernetes.io/projected/c188a486-111c-4c1b-9fbb-385fbe372986-kube-api-access-vftxr\") pod \"c188a486-111c-4c1b-9fbb-385fbe372986\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.303769 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-utilities\") pod \"c188a486-111c-4c1b-9fbb-385fbe372986\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.303880 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-catalog-content\") pod \"c188a486-111c-4c1b-9fbb-385fbe372986\" (UID: \"c188a486-111c-4c1b-9fbb-385fbe372986\") " Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.305929 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-utilities" (OuterVolumeSpecName: "utilities") pod "c188a486-111c-4c1b-9fbb-385fbe372986" (UID: "c188a486-111c-4c1b-9fbb-385fbe372986"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.311063 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c188a486-111c-4c1b-9fbb-385fbe372986-kube-api-access-vftxr" (OuterVolumeSpecName: "kube-api-access-vftxr") pod "c188a486-111c-4c1b-9fbb-385fbe372986" (UID: "c188a486-111c-4c1b-9fbb-385fbe372986"). InnerVolumeSpecName "kube-api-access-vftxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.311461 4691 scope.go:117] "RemoveContainer" containerID="c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.360387 4691 scope.go:117] "RemoveContainer" containerID="a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.386640 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c188a486-111c-4c1b-9fbb-385fbe372986" (UID: "c188a486-111c-4c1b-9fbb-385fbe372986"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.389448 4691 scope.go:117] "RemoveContainer" containerID="e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f" Sep 30 06:33:08 crc kubenswrapper[4691]: E0930 06:33:08.390192 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f\": container with ID starting with e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f not found: ID does not exist" containerID="e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.390240 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f"} err="failed to get container status \"e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f\": rpc error: code = NotFound desc = could not find container \"e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f\": container with ID starting with e87e05f081f1a102e32e9fff6a0a40f99721b5e851d86c85c5e99b2e3a1b5c0f not found: ID does not exist" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.390269 4691 scope.go:117] "RemoveContainer" containerID="c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0" Sep 30 06:33:08 crc kubenswrapper[4691]: E0930 06:33:08.390549 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0\": container with ID starting with c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0 not found: ID does not exist" containerID="c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.390572 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0"} err="failed to get container status \"c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0\": rpc error: code = NotFound desc = could not find container \"c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0\": container with ID starting with c07c8a6cd50e3c1d5c3b73814eefde1871cc750044db2870a46d2e70a03a40e0 not found: ID does not exist" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.390585 4691 scope.go:117] "RemoveContainer" containerID="a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed" Sep 30 06:33:08 crc kubenswrapper[4691]: E0930 06:33:08.390936 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed\": container with ID starting with a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed not found: ID does not exist" containerID="a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.390954 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed"} err="failed to get container status \"a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed\": rpc error: code = NotFound desc = could not 
find container \"a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed\": container with ID starting with a7ecf73e0bfd74fb317144dc8cfca821248b661b094596ee52604487d649e5ed not found: ID does not exist" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.404946 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vftxr\" (UniqueName: \"kubernetes.io/projected/c188a486-111c-4c1b-9fbb-385fbe372986-kube-api-access-vftxr\") on node \"crc\" DevicePath \"\"" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.404970 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.404983 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c188a486-111c-4c1b-9fbb-385fbe372986-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.620954 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qwgtj"] Sep 30 06:33:08 crc kubenswrapper[4691]: I0930 06:33:08.624693 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qwgtj"] Sep 30 06:33:09 crc kubenswrapper[4691]: I0930 06:33:09.238284 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c188a486-111c-4c1b-9fbb-385fbe372986" path="/var/lib/kubelet/pods/c188a486-111c-4c1b-9fbb-385fbe372986/volumes" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.746214 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55"] Sep 30 06:33:18 crc kubenswrapper[4691]: E0930 06:33:18.749570 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c188a486-111c-4c1b-9fbb-385fbe372986" containerName="registry-server" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.749725 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c188a486-111c-4c1b-9fbb-385fbe372986" containerName="registry-server" Sep 30 06:33:18 crc kubenswrapper[4691]: E0930 06:33:18.749836 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerName="extract-utilities" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.749968 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerName="extract-utilities" Sep 30 06:33:18 crc kubenswrapper[4691]: E0930 06:33:18.750058 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c175656-b086-4abd-aefb-ccea22610682" containerName="extract-utilities" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.750232 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c175656-b086-4abd-aefb-ccea22610682" containerName="extract-utilities" Sep 30 06:33:18 crc kubenswrapper[4691]: E0930 06:33:18.750318 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerName="registry-server" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.750397 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerName="registry-server" Sep 30 06:33:18 crc kubenswrapper[4691]: E0930 06:33:18.750477 4691 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c188a486-111c-4c1b-9fbb-385fbe372986" containerName="extract-utilities" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.750555 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c188a486-111c-4c1b-9fbb-385fbe372986" containerName="extract-utilities" Sep 30 06:33:18 crc kubenswrapper[4691]: E0930 06:33:18.750666 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerName="extract-content" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.750800 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerName="extract-content" Sep 30 06:33:18 crc kubenswrapper[4691]: E0930 06:33:18.750929 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c175656-b086-4abd-aefb-ccea22610682" containerName="extract-content" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.751016 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c175656-b086-4abd-aefb-ccea22610682" containerName="extract-content" Sep 30 06:33:18 crc kubenswrapper[4691]: E0930 06:33:18.751098 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c188a486-111c-4c1b-9fbb-385fbe372986" containerName="extract-content" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.751209 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c188a486-111c-4c1b-9fbb-385fbe372986" containerName="extract-content" Sep 30 06:33:18 crc kubenswrapper[4691]: E0930 06:33:18.751302 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c175656-b086-4abd-aefb-ccea22610682" containerName="registry-server" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.751375 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c175656-b086-4abd-aefb-ccea22610682" containerName="registry-server" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.751624 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c188a486-111c-4c1b-9fbb-385fbe372986" containerName="registry-server" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.751725 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c175656-b086-4abd-aefb-ccea22610682" containerName="registry-server" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.751810 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="76f92011-8ebb-4535-ac7f-fa625a6ceefe" containerName="registry-server" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.752851 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.756101 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.756316 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-nf5cx" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.757219 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.766044 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.770226 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-bmf7s" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.781575 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.782650 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.784417 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-6dmrw" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.797481 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.800573 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.803453 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.807278 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-9h6w4" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.809610 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.828545 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.829491 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.831143 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-d9gls" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.841342 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nq6r\" (UniqueName: \"kubernetes.io/projected/d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2-kube-api-access-8nq6r\") pod \"cinder-operator-controller-manager-644bddb6d8-9dj86\" (UID: \"d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.841580 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8qdc\" (UniqueName: \"kubernetes.io/projected/60dcfaf5-c692-44e5-8868-1dfccb14f535-kube-api-access-g8qdc\") pod \"glance-operator-controller-manager-84958c4d49-qwmv9\" (UID: \"60dcfaf5-c692-44e5-8868-1dfccb14f535\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.841938 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxnjg\" (UniqueName: \"kubernetes.io/projected/2a1af285-1505-419c-bacc-16d8a161aca2-kube-api-access-gxnjg\") pod \"designate-operator-controller-manager-84f4f7b77b-8wf6n\" (UID: \"2a1af285-1505-419c-bacc-16d8a161aca2\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.842057 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmjcl\" (UniqueName: \"kubernetes.io/projected/a5779e0d-8902-4a45-b28e-4253af3938ae-kube-api-access-qmjcl\") pod \"barbican-operator-controller-manager-6ff8b75857-qqx55\" (UID: \"a5779e0d-8902-4a45-b28e-4253af3938ae\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.847131 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.859962 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.872605 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.873761 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.877382 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.877586 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-pt2lv" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.881854 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.882918 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.884829 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-xmhwd" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.885920 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.889433 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.890414 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.894320 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-rq4wj" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.895214 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.897460 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.918043 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.928766 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.932425 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-tz9mc" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.942095 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.943770 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rg7d\" (UniqueName: \"kubernetes.io/projected/c9f2f281-c656-4c29-bf86-c38f9cd79528-kube-api-access-6rg7d\") pod \"horizon-operator-controller-manager-9f4696d94-rxdj9\" (UID: \"c9f2f281-c656-4c29-bf86-c38f9cd79528\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.943855 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8qdc\" (UniqueName: \"kubernetes.io/projected/60dcfaf5-c692-44e5-8868-1dfccb14f535-kube-api-access-g8qdc\") pod \"glance-operator-controller-manager-84958c4d49-qwmv9\" (UID: \"60dcfaf5-c692-44e5-8868-1dfccb14f535\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.943895 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rck5x\" (UniqueName: \"kubernetes.io/projected/5c0ba848-ac6e-4515-99e3-e1665ff79d7c-kube-api-access-rck5x\") pod \"heat-operator-controller-manager-5d889d78cf-5j6fw\" (UID: \"5c0ba848-ac6e-4515-99e3-e1665ff79d7c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.943921 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxnjg\" (UniqueName: \"kubernetes.io/projected/2a1af285-1505-419c-bacc-16d8a161aca2-kube-api-access-gxnjg\") pod \"designate-operator-controller-manager-84f4f7b77b-8wf6n\" (UID: \"2a1af285-1505-419c-bacc-16d8a161aca2\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.943944 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzn9p\" (UniqueName: \"kubernetes.io/projected/d6fb63f5-e7b6-47fd-ac44-b59058899b3c-kube-api-access-dzn9p\") pod \"keystone-operator-controller-manager-5bd55b4bff-9xs5h\" (UID: \"d6fb63f5-e7b6-47fd-ac44-b59058899b3c\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.943966 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmjcl\" (UniqueName: \"kubernetes.io/projected/a5779e0d-8902-4a45-b28e-4253af3938ae-kube-api-access-qmjcl\") pod \"barbican-operator-controller-manager-6ff8b75857-qqx55\" (UID: \"a5779e0d-8902-4a45-b28e-4253af3938ae\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.944002 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nq6r\" (UniqueName: 
\"kubernetes.io/projected/d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2-kube-api-access-8nq6r\") pod \"cinder-operator-controller-manager-644bddb6d8-9dj86\" (UID: \"d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.944101 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfx62\" (UniqueName: \"kubernetes.io/projected/9c5c1b63-6185-424c-a584-35a18e2c69bd-kube-api-access-wfx62\") pod \"ironic-operator-controller-manager-7975b88857-x79vm\" (UID: \"9c5c1b63-6185-424c-a584-35a18e2c69bd\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.966113 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.976796 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.976833 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx"] Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.977726 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-rhqdt" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.982918 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxnjg\" (UniqueName: \"kubernetes.io/projected/2a1af285-1505-419c-bacc-16d8a161aca2-kube-api-access-gxnjg\") pod \"designate-operator-controller-manager-84f4f7b77b-8wf6n\" (UID: \"2a1af285-1505-419c-bacc-16d8a161aca2\") " pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.984438 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8qdc\" (UniqueName: \"kubernetes.io/projected/60dcfaf5-c692-44e5-8868-1dfccb14f535-kube-api-access-g8qdc\") pod \"glance-operator-controller-manager-84958c4d49-qwmv9\" (UID: \"60dcfaf5-c692-44e5-8868-1dfccb14f535\") " pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" Sep 30 06:33:18 crc kubenswrapper[4691]: I0930 06:33:18.984743 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nq6r\" (UniqueName: \"kubernetes.io/projected/d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2-kube-api-access-8nq6r\") pod \"cinder-operator-controller-manager-644bddb6d8-9dj86\" (UID: \"d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2\") " pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:18.999364 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.000460 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.002672 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-x82wk" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.007580 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.017475 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.018581 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.038147 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-7nj2q" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.045520 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d4f2da8-966a-4a80-aca6-efdd8faca337-cert\") pod \"infra-operator-controller-manager-7d857cc749-jgh2d\" (UID: \"1d4f2da8-966a-4a80-aca6-efdd8faca337\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.045579 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rck5x\" (UniqueName: \"kubernetes.io/projected/5c0ba848-ac6e-4515-99e3-e1665ff79d7c-kube-api-access-rck5x\") pod \"heat-operator-controller-manager-5d889d78cf-5j6fw\" (UID: \"5c0ba848-ac6e-4515-99e3-e1665ff79d7c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.045600 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzn9p\" (UniqueName: \"kubernetes.io/projected/d6fb63f5-e7b6-47fd-ac44-b59058899b3c-kube-api-access-dzn9p\") pod \"keystone-operator-controller-manager-5bd55b4bff-9xs5h\" (UID: \"d6fb63f5-e7b6-47fd-ac44-b59058899b3c\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.045640 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fks44\" (UniqueName: \"kubernetes.io/projected/1d4f2da8-966a-4a80-aca6-efdd8faca337-kube-api-access-fks44\") pod \"infra-operator-controller-manager-7d857cc749-jgh2d\" (UID: \"1d4f2da8-966a-4a80-aca6-efdd8faca337\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.045671 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfx62\" (UniqueName: \"kubernetes.io/projected/9c5c1b63-6185-424c-a584-35a18e2c69bd-kube-api-access-wfx62\") pod \"ironic-operator-controller-manager-7975b88857-x79vm\" (UID: \"9c5c1b63-6185-424c-a584-35a18e2c69bd\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.045691 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-6rg7d\" (UniqueName: \"kubernetes.io/projected/c9f2f281-c656-4c29-bf86-c38f9cd79528-kube-api-access-6rg7d\") pod \"horizon-operator-controller-manager-9f4696d94-rxdj9\" (UID: \"c9f2f281-c656-4c29-bf86-c38f9cd79528\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.048797 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmjcl\" (UniqueName: \"kubernetes.io/projected/a5779e0d-8902-4a45-b28e-4253af3938ae-kube-api-access-qmjcl\") pod \"barbican-operator-controller-manager-6ff8b75857-qqx55\" (UID: \"a5779e0d-8902-4a45-b28e-4253af3938ae\") " pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.051344 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.066749 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzn9p\" (UniqueName: \"kubernetes.io/projected/d6fb63f5-e7b6-47fd-ac44-b59058899b3c-kube-api-access-dzn9p\") pod \"keystone-operator-controller-manager-5bd55b4bff-9xs5h\" (UID: \"d6fb63f5-e7b6-47fd-ac44-b59058899b3c\") " pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.066973 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.068528 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.071916 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfx62\" (UniqueName: \"kubernetes.io/projected/9c5c1b63-6185-424c-a584-35a18e2c69bd-kube-api-access-wfx62\") pod \"ironic-operator-controller-manager-7975b88857-x79vm\" (UID: \"9c5c1b63-6185-424c-a584-35a18e2c69bd\") " pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.072616 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rck5x\" (UniqueName: \"kubernetes.io/projected/5c0ba848-ac6e-4515-99e3-e1665ff79d7c-kube-api-access-rck5x\") pod \"heat-operator-controller-manager-5d889d78cf-5j6fw\" (UID: \"5c0ba848-ac6e-4515-99e3-e1665ff79d7c\") " pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.073139 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-t4vz8" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.073479 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.078368 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rg7d\" (UniqueName: \"kubernetes.io/projected/c9f2f281-c656-4c29-bf86-c38f9cd79528-kube-api-access-6rg7d\") pod \"horizon-operator-controller-manager-9f4696d94-rxdj9\" (UID: \"c9f2f281-c656-4c29-bf86-c38f9cd79528\") " pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.079428 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.094130 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.103321 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.104417 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.109403 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-hl6jj" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.110022 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.130189 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.131717 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.133262 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.135172 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.135344 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.136415 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.139561 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-wdkw7" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.139879 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-vkc9m" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.143593 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.146462 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zx2pr\" (UniqueName: \"kubernetes.io/projected/11bd74e6-05a4-44fc-b360-f1d71352011e-kube-api-access-zx2pr\") pod \"manila-operator-controller-manager-6d68dbc695-rtqwx\" (UID: \"11bd74e6-05a4-44fc-b360-f1d71352011e\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.146507 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d4f2da8-966a-4a80-aca6-efdd8faca337-cert\") pod \"infra-operator-controller-manager-7d857cc749-jgh2d\" (UID: \"1d4f2da8-966a-4a80-aca6-efdd8faca337\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.146544 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5mcf\" (UniqueName: \"kubernetes.io/projected/a1aaa7fa-8695-4124-ad5a-26f11a99b1c8-kube-api-access-f5mcf\") pod \"mariadb-operator-controller-manager-88c7-c5nbk\" (UID: \"a1aaa7fa-8695-4124-ad5a-26f11a99b1c8\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.146567 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnnsx\" (UniqueName: \"kubernetes.io/projected/2c674607-65d9-4be2-9244-d61eadb97dd7-kube-api-access-fnnsx\") pod \"neutron-operator-controller-manager-64d7b59854-2kzwg\" (UID: \"2c674607-65d9-4be2-9244-d61eadb97dd7\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.146590 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fks44\" (UniqueName: \"kubernetes.io/projected/1d4f2da8-966a-4a80-aca6-efdd8faca337-kube-api-access-fks44\") pod \"infra-operator-controller-manager-7d857cc749-jgh2d\" (UID: \"1d4f2da8-966a-4a80-aca6-efdd8faca337\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.149563 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.159485 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1d4f2da8-966a-4a80-aca6-efdd8faca337-cert\") pod \"infra-operator-controller-manager-7d857cc749-jgh2d\" (UID: \"1d4f2da8-966a-4a80-aca6-efdd8faca337\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.166860 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fks44\" (UniqueName: \"kubernetes.io/projected/1d4f2da8-966a-4a80-aca6-efdd8faca337-kube-api-access-fks44\") pod \"infra-operator-controller-manager-7d857cc749-jgh2d\" (UID: \"1d4f2da8-966a-4a80-aca6-efdd8faca337\") " pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.167943 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.176956 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.178298 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.180159 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-rlxrf" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.189087 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.209195 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.214215 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.221379 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.231628 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.244592 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.245758 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.248239 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5mcf\" (UniqueName: \"kubernetes.io/projected/a1aaa7fa-8695-4124-ad5a-26f11a99b1c8-kube-api-access-f5mcf\") pod \"mariadb-operator-controller-manager-88c7-c5nbk\" (UID: \"a1aaa7fa-8695-4124-ad5a-26f11a99b1c8\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.248278 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5449m\" (UniqueName: \"kubernetes.io/projected/a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af-kube-api-access-5449m\") pod \"ovn-operator-controller-manager-9976ff44c-bzx6l\" (UID: \"a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.248304 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnnsx\" (UniqueName: \"kubernetes.io/projected/2c674607-65d9-4be2-9244-d61eadb97dd7-kube-api-access-fnnsx\") pod \"neutron-operator-controller-manager-64d7b59854-2kzwg\" (UID: \"2c674607-65d9-4be2-9244-d61eadb97dd7\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.248350 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-r8s99\" (UID: \"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.248374 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x77hc\" (UniqueName: \"kubernetes.io/projected/a8dd4aa3-ab8b-4f66-9722-8873600c87eb-kube-api-access-x77hc\") pod \"nova-operator-controller-manager-c7c776c96-gtznz\" (UID: \"a8dd4aa3-ab8b-4f66-9722-8873600c87eb\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.248396 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zx2pr\" (UniqueName: \"kubernetes.io/projected/11bd74e6-05a4-44fc-b360-f1d71352011e-kube-api-access-zx2pr\") pod \"manila-operator-controller-manager-6d68dbc695-rtqwx\" (UID: \"11bd74e6-05a4-44fc-b360-f1d71352011e\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.248421 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xcv8\" (UniqueName: \"kubernetes.io/projected/88c75f60-538f-4059-aaeb-b41dcdcf7cfa-kube-api-access-8xcv8\") pod \"octavia-operator-controller-manager-76fcc6dc7c-h42cw\" (UID: \"88c75f60-538f-4059-aaeb-b41dcdcf7cfa\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.248443 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5svk\" 
(UniqueName: \"kubernetes.io/projected/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-kube-api-access-q5svk\") pod \"openstack-baremetal-operator-controller-manager-6d776955-r8s99\" (UID: \"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.250846 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.265433 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.272757 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-9hdjg" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.293901 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.294999 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.295736 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnnsx\" (UniqueName: \"kubernetes.io/projected/2c674607-65d9-4be2-9244-d61eadb97dd7-kube-api-access-fnnsx\") pod \"neutron-operator-controller-manager-64d7b59854-2kzwg\" (UID: \"2c674607-65d9-4be2-9244-d61eadb97dd7\") " pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.296612 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zx2pr\" (UniqueName: \"kubernetes.io/projected/11bd74e6-05a4-44fc-b360-f1d71352011e-kube-api-access-zx2pr\") pod \"manila-operator-controller-manager-6d68dbc695-rtqwx\" (UID: \"11bd74e6-05a4-44fc-b360-f1d71352011e\") " pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.297405 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5mcf\" (UniqueName: \"kubernetes.io/projected/a1aaa7fa-8695-4124-ad5a-26f11a99b1c8-kube-api-access-f5mcf\") pod \"mariadb-operator-controller-manager-88c7-c5nbk\" (UID: \"a1aaa7fa-8695-4124-ad5a-26f11a99b1c8\") " pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.297782 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-5j9vw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.308017 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.375137 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2kbz\" (UniqueName: \"kubernetes.io/projected/3bce910d-be3e-4332-89df-75e715d95988-kube-api-access-z2kbz\") pod \"swift-operator-controller-manager-bc7dc7bd9-s4rpv\" (UID: \"3bce910d-be3e-4332-89df-75e715d95988\") " 
pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.375193 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhjkv\" (UniqueName: \"kubernetes.io/projected/52aa93bd-f5d7-479e-a8fe-2c6e70a70fae-kube-api-access-nhjkv\") pod \"placement-operator-controller-manager-589c58c6c-vd4rw\" (UID: \"52aa93bd-f5d7-479e-a8fe-2c6e70a70fae\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.375296 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-r8s99\" (UID: \"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.375320 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x77hc\" (UniqueName: \"kubernetes.io/projected/a8dd4aa3-ab8b-4f66-9722-8873600c87eb-kube-api-access-x77hc\") pod \"nova-operator-controller-manager-c7c776c96-gtznz\" (UID: \"a8dd4aa3-ab8b-4f66-9722-8873600c87eb\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.375355 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xcv8\" (UniqueName: \"kubernetes.io/projected/88c75f60-538f-4059-aaeb-b41dcdcf7cfa-kube-api-access-8xcv8\") pod \"octavia-operator-controller-manager-76fcc6dc7c-h42cw\" (UID: \"88c75f60-538f-4059-aaeb-b41dcdcf7cfa\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.375541 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5svk\" (UniqueName: \"kubernetes.io/projected/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-kube-api-access-q5svk\") pod \"openstack-baremetal-operator-controller-manager-6d776955-r8s99\" (UID: \"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.375579 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5449m\" (UniqueName: \"kubernetes.io/projected/a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af-kube-api-access-5449m\") pod \"ovn-operator-controller-manager-9976ff44c-bzx6l\" (UID: \"a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" Sep 30 06:33:19 crc kubenswrapper[4691]: E0930 06:33:19.381829 4691 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 06:33:19 crc kubenswrapper[4691]: E0930 06:33:19.381911 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert podName:4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d nodeName:}" failed. No retries permitted until 2025-09-30 06:33:19.881872458 +0000 UTC m=+843.356893498 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-r8s99" (UID: "4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.422709 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5svk\" (UniqueName: \"kubernetes.io/projected/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-kube-api-access-q5svk\") pod \"openstack-baremetal-operator-controller-manager-6d776955-r8s99\" (UID: \"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.423004 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.439450 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.457245 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.461954 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.463222 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.464950 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5449m\" (UniqueName: \"kubernetes.io/projected/a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af-kube-api-access-5449m\") pod \"ovn-operator-controller-manager-9976ff44c-bzx6l\" (UID: \"a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af\") " pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.474535 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x77hc\" (UniqueName: \"kubernetes.io/projected/a8dd4aa3-ab8b-4f66-9722-8873600c87eb-kube-api-access-x77hc\") pod \"nova-operator-controller-manager-c7c776c96-gtznz\" (UID: \"a8dd4aa3-ab8b-4f66-9722-8873600c87eb\") " pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.477851 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-xqkm9" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.481171 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.489737 4691 util.go:30] "No sandbox for pod can be found. 
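
The repeated "secret ... not found" MountVolume.SetUp failures in this stretch are kubelet's volume manager waiting for webhook serving-certificate secrets that have not been published yet. The durationBeforeRetry values visible in the log (500ms here, then 1s and 2s further down) show its per-operation exponential backoff, and the same log shows the retry paying off: the openstack-operator "cert" volume mounts successfully at 06:33:20.372297 once "webhook-server-cert" exists. A minimal sketch of that retry schedule, using the real wait.Backoff helper from apimachinery; the 500ms base and doubling factor are read off the log, while the 2m cap is an assumption for illustration, not something the log states:

```go
package main

import (
	"fmt"
	"time"

	"k8s.io/apimachinery/pkg/util/wait"
)

func main() {
	// Retry schedule seen in the log: 500ms, 1s, 2s, ... doubling per
	// attempt. The cap below is an assumption for this sketch.
	b := wait.Backoff{
		Duration: 500 * time.Millisecond,
		Factor:   2.0,
		Steps:    9,
		Cap:      2 * time.Minute,
	}
	for attempt := 1; b.Steps > 0; attempt++ {
		fmt.Printf("retry %d: durationBeforeRetry %v\n", attempt, b.Step())
	}
}
```

Nothing in this section shows kubelet giving up on a volume; each mount keeps retrying until the missing secret appears.
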
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.491484 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xcv8\" (UniqueName: \"kubernetes.io/projected/88c75f60-538f-4059-aaeb-b41dcdcf7cfa-kube-api-access-8xcv8\") pod \"octavia-operator-controller-manager-76fcc6dc7c-h42cw\" (UID: \"88c75f60-538f-4059-aaeb-b41dcdcf7cfa\") " pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.503591 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.552125 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdw5b\" (UniqueName: \"kubernetes.io/projected/88da73a4-c9e2-4a78-b313-8cf689562e38-kube-api-access-tdw5b\") pod \"telemetry-operator-controller-manager-b8d54b5d7-44qv5\" (UID: \"88da73a4-c9e2-4a78-b313-8cf689562e38\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.552437 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2kbz\" (UniqueName: \"kubernetes.io/projected/3bce910d-be3e-4332-89df-75e715d95988-kube-api-access-z2kbz\") pod \"swift-operator-controller-manager-bc7dc7bd9-s4rpv\" (UID: \"3bce910d-be3e-4332-89df-75e715d95988\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.552468 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhjkv\" (UniqueName: \"kubernetes.io/projected/52aa93bd-f5d7-479e-a8fe-2c6e70a70fae-kube-api-access-nhjkv\") pod \"placement-operator-controller-manager-589c58c6c-vd4rw\" (UID: \"52aa93bd-f5d7-479e-a8fe-2c6e70a70fae\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.552773 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.554978 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.570689 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.570785 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.571317 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.572978 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.575173 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-5q7s5" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.582259 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-wr9w6" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.582304 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.594654 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.603107 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhjkv\" (UniqueName: \"kubernetes.io/projected/52aa93bd-f5d7-479e-a8fe-2c6e70a70fae-kube-api-access-nhjkv\") pod \"placement-operator-controller-manager-589c58c6c-vd4rw\" (UID: \"52aa93bd-f5d7-479e-a8fe-2c6e70a70fae\") " pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.603372 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2kbz\" (UniqueName: \"kubernetes.io/projected/3bce910d-be3e-4332-89df-75e715d95988-kube-api-access-z2kbz\") pod \"swift-operator-controller-manager-bc7dc7bd9-s4rpv\" (UID: \"3bce910d-be3e-4332-89df-75e715d95988\") " pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.629981 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.630960 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.632901 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-h9n4m" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.634089 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c"] Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.653353 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z698f\" (UniqueName: \"kubernetes.io/projected/a1cbd98a-2f66-4649-8347-938d07f93eb1-kube-api-access-z698f\") pod \"test-operator-controller-manager-f66b554c6-8qj9q\" (UID: \"a1cbd98a-2f66-4649-8347-938d07f93eb1\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.653416 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdw5b\" (UniqueName: \"kubernetes.io/projected/88da73a4-c9e2-4a78-b313-8cf689562e38-kube-api-access-tdw5b\") pod \"telemetry-operator-controller-manager-b8d54b5d7-44qv5\" (UID: \"88da73a4-c9e2-4a78-b313-8cf689562e38\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.677497 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdw5b\" (UniqueName: \"kubernetes.io/projected/88da73a4-c9e2-4a78-b313-8cf689562e38-kube-api-access-tdw5b\") pod \"telemetry-operator-controller-manager-b8d54b5d7-44qv5\" (UID: \"88da73a4-c9e2-4a78-b313-8cf689562e38\") " pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.756925 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1013022f-3fa2-44d5-a111-5f89a6a7bb17-cert\") pod \"openstack-operator-controller-manager-86d6bdfc6d-7zkhq\" (UID: \"1013022f-3fa2-44d5-a111-5f89a6a7bb17\") " pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.757007 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw8rj\" (UniqueName: \"kubernetes.io/projected/54fbcf55-e81d-4336-8e38-9bb1d3ec3c47-kube-api-access-xw8rj\") pod \"rabbitmq-cluster-operator-manager-79d8469568-7v82c\" (UID: \"54fbcf55-e81d-4336-8e38-9bb1d3ec3c47\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.757042 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z698f\" (UniqueName: \"kubernetes.io/projected/a1cbd98a-2f66-4649-8347-938d07f93eb1-kube-api-access-z698f\") pod \"test-operator-controller-manager-f66b554c6-8qj9q\" (UID: \"a1cbd98a-2f66-4649-8347-938d07f93eb1\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.757065 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzdh7\" (UniqueName: 
\"kubernetes.io/projected/9678f82b-58e6-4529-bdf6-6faaf2d7bcfa-kube-api-access-fzdh7\") pod \"watcher-operator-controller-manager-bd494bc6d-x495w\" (UID: \"9678f82b-58e6-4529-bdf6-6faaf2d7bcfa\") " pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.757120 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4rdm\" (UniqueName: \"kubernetes.io/projected/1013022f-3fa2-44d5-a111-5f89a6a7bb17-kube-api-access-h4rdm\") pod \"openstack-operator-controller-manager-86d6bdfc6d-7zkhq\" (UID: \"1013022f-3fa2-44d5-a111-5f89a6a7bb17\") " pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.796543 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z698f\" (UniqueName: \"kubernetes.io/projected/a1cbd98a-2f66-4649-8347-938d07f93eb1-kube-api-access-z698f\") pod \"test-operator-controller-manager-f66b554c6-8qj9q\" (UID: \"a1cbd98a-2f66-4649-8347-938d07f93eb1\") " pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.855042 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.860467 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4rdm\" (UniqueName: \"kubernetes.io/projected/1013022f-3fa2-44d5-a111-5f89a6a7bb17-kube-api-access-h4rdm\") pod \"openstack-operator-controller-manager-86d6bdfc6d-7zkhq\" (UID: \"1013022f-3fa2-44d5-a111-5f89a6a7bb17\") " pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.860538 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1013022f-3fa2-44d5-a111-5f89a6a7bb17-cert\") pod \"openstack-operator-controller-manager-86d6bdfc6d-7zkhq\" (UID: \"1013022f-3fa2-44d5-a111-5f89a6a7bb17\") " pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.860586 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xw8rj\" (UniqueName: \"kubernetes.io/projected/54fbcf55-e81d-4336-8e38-9bb1d3ec3c47-kube-api-access-xw8rj\") pod \"rabbitmq-cluster-operator-manager-79d8469568-7v82c\" (UID: \"54fbcf55-e81d-4336-8e38-9bb1d3ec3c47\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.860690 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzdh7\" (UniqueName: \"kubernetes.io/projected/9678f82b-58e6-4529-bdf6-6faaf2d7bcfa-kube-api-access-fzdh7\") pod \"watcher-operator-controller-manager-bd494bc6d-x495w\" (UID: \"9678f82b-58e6-4529-bdf6-6faaf2d7bcfa\") " pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" Sep 30 06:33:19 crc kubenswrapper[4691]: E0930 06:33:19.860931 4691 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Sep 30 06:33:19 crc kubenswrapper[4691]: E0930 06:33:19.861023 4691 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/1013022f-3fa2-44d5-a111-5f89a6a7bb17-cert podName:1013022f-3fa2-44d5-a111-5f89a6a7bb17 nodeName:}" failed. No retries permitted until 2025-09-30 06:33:20.360999613 +0000 UTC m=+843.836020653 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1013022f-3fa2-44d5-a111-5f89a6a7bb17-cert") pod "openstack-operator-controller-manager-86d6bdfc6d-7zkhq" (UID: "1013022f-3fa2-44d5-a111-5f89a6a7bb17") : secret "webhook-server-cert" not found Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.881111 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.888201 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4rdm\" (UniqueName: \"kubernetes.io/projected/1013022f-3fa2-44d5-a111-5f89a6a7bb17-kube-api-access-h4rdm\") pod \"openstack-operator-controller-manager-86d6bdfc6d-7zkhq\" (UID: \"1013022f-3fa2-44d5-a111-5f89a6a7bb17\") " pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.889533 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw8rj\" (UniqueName: \"kubernetes.io/projected/54fbcf55-e81d-4336-8e38-9bb1d3ec3c47-kube-api-access-xw8rj\") pod \"rabbitmq-cluster-operator-manager-79d8469568-7v82c\" (UID: \"54fbcf55-e81d-4336-8e38-9bb1d3ec3c47\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.892372 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzdh7\" (UniqueName: \"kubernetes.io/projected/9678f82b-58e6-4529-bdf6-6faaf2d7bcfa-kube-api-access-fzdh7\") pod \"watcher-operator-controller-manager-bd494bc6d-x495w\" (UID: \"9678f82b-58e6-4529-bdf6-6faaf2d7bcfa\") " pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.931964 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.941697 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.962499 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-r8s99\" (UID: \"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:19 crc kubenswrapper[4691]: E0930 06:33:19.962641 4691 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 06:33:19 crc kubenswrapper[4691]: E0930 06:33:19.962693 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert podName:4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d nodeName:}" failed. 
No retries permitted until 2025-09-30 06:33:20.962676747 +0000 UTC m=+844.437697787 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-r8s99" (UID: "4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.962784 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" Sep 30 06:33:19 crc kubenswrapper[4691]: I0930 06:33:19.987213 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.249536 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.368216 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1013022f-3fa2-44d5-a111-5f89a6a7bb17-cert\") pod \"openstack-operator-controller-manager-86d6bdfc6d-7zkhq\" (UID: \"1013022f-3fa2-44d5-a111-5f89a6a7bb17\") " pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.372297 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1013022f-3fa2-44d5-a111-5f89a6a7bb17-cert\") pod \"openstack-operator-controller-manager-86d6bdfc6d-7zkhq\" (UID: \"1013022f-3fa2-44d5-a111-5f89a6a7bb17\") " pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.477979 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" event={"ID":"60dcfaf5-c692-44e5-8868-1dfccb14f535","Type":"ContainerStarted","Data":"348462b4208a8070cd4f76e11a3f4780dffb5d379f200973bf6dca309ca2dbc7"} Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.547396 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.663802 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.674249 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.681143 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.686454 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.696229 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm"] Sep 30 06:33:20 crc kubenswrapper[4691]: E0930 06:33:20.704523 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:f6b935f67979298c3c263ad84d277e5cf26c0dbba3f85f255c1ec4d1d75241d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gxnjg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-84f4f7b77b-8wf6n_openstack-operators(2a1af285-1505-419c-bacc-16d8a161aca2): ErrImagePull: pull 
QPS exceeded" logger="UnhandledError" Sep 30 06:33:20 crc kubenswrapper[4691]: E0930 06:33:20.704566 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8xcv8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-76fcc6dc7c-h42cw_openstack-operators(88c75f60-538f-4059-aaeb-b41dcdcf7cfa): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.705531 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.711175 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.716841 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.722978 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.726662 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h"] Sep 30 06:33:20 crc 
kubenswrapper[4691]: I0930 06:33:20.730476 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.735911 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.781750 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l"] Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.796037 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg"] Sep 30 06:33:20 crc kubenswrapper[4691]: E0930 06:33:20.800669 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5449m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-9976ff44c-bzx6l_openstack-operators(a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 06:33:20 crc kubenswrapper[4691]: W0930 06:33:20.805513 4691 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c674607_65d9_4be2_9244_d61eadb97dd7.slice/crio-f550540f7742c7b56b69e06f386b8c0bce8b204809d183fc1eae5685034b49a9 WatchSource:0}: Error finding container f550540f7742c7b56b69e06f386b8c0bce8b204809d183fc1eae5685034b49a9: Status 404 returned error can't find the container with id f550540f7742c7b56b69e06f386b8c0bce8b204809d183fc1eae5685034b49a9 Sep 30 06:33:20 crc kubenswrapper[4691]: E0930 06:33:20.831758 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fnnsx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-64d7b59854-2kzwg_openstack-operators(2c674607-65d9-4be2-9244-d61eadb97dd7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 06:33:20 crc kubenswrapper[4691]: E0930 06:33:20.885697 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" podUID="88c75f60-538f-4059-aaeb-b41dcdcf7cfa" Sep 30 06:33:20 crc kubenswrapper[4691]: E0930 06:33:20.886527 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" 
podUID="2a1af285-1505-419c-bacc-16d8a161aca2" Sep 30 06:33:20 crc kubenswrapper[4691]: E0930 06:33:20.956175 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" podUID="2c674607-65d9-4be2-9244-d61eadb97dd7" Sep 30 06:33:20 crc kubenswrapper[4691]: E0930 06:33:20.966321 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" podUID="a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af" Sep 30 06:33:20 crc kubenswrapper[4691]: I0930 06:33:20.989150 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-r8s99\" (UID: \"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:20 crc kubenswrapper[4691]: E0930 06:33:20.989356 4691 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 06:33:20 crc kubenswrapper[4691]: E0930 06:33:20.989442 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert podName:4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d nodeName:}" failed. No retries permitted until 2025-09-30 06:33:22.989419982 +0000 UTC m=+846.464441022 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert") pod "openstack-baremetal-operator-controller-manager-6d776955-r8s99" (UID: "4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d") : secret "openstack-baremetal-operator-webhook-server-cert" not found Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.005352 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q"] Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.022792 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5"] Sep 30 06:33:21 crc kubenswrapper[4691]: W0930 06:33:21.024078 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88da73a4_c9e2_4a78_b313_8cf689562e38.slice/crio-da5addb3d10a065ce1a28393bf3dbe17bfb3f27898a0a1c67d90b75e3679b88f WatchSource:0}: Error finding container da5addb3d10a065ce1a28393bf3dbe17bfb3f27898a0a1c67d90b75e3679b88f: Status 404 returned error can't find the container with id da5addb3d10a065ce1a28393bf3dbe17bfb3f27898a0a1c67d90b75e3679b88f Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.026038 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tdw5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-b8d54b5d7-44qv5_openstack-operators(88da73a4-c9e2-4a78-b313-8cf689562e38): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.034155 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w"] Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.062657 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c"] Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.090630 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv"] Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.100716 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw"] Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.104075 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq"] Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.126599 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xw8rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-79d8469568-7v82c_openstack-operators(54fbcf55-e81d-4336-8e38-9bb1d3ec3c47): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.128246 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" podUID="54fbcf55-e81d-4336-8e38-9bb1d3ec3c47" Sep 30 06:33:21 crc kubenswrapper[4691]: W0930 06:33:21.152312 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52aa93bd_f5d7_479e_a8fe_2c6e70a70fae.slice/crio-21a9c665b1cbd663f5ea9423c661375ac433656876d392fab5d3738cfac6ee8f WatchSource:0}: Error finding container 21a9c665b1cbd663f5ea9423c661375ac433656876d392fab5d3738cfac6ee8f: Status 404 returned error can't find the container with id 21a9c665b1cbd663f5ea9423c661375ac433656876d392fab5d3738cfac6ee8f Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.155468 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 
--leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2kbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-bc7dc7bd9-s4rpv_openstack-operators(3bce910d-be3e-4332-89df-75e715d95988): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.156453 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2,Command:[/manager],Args:[--health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080 --leader-elect],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-nhjkv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-589c58c6c-vd4rw_openstack-operators(52aa93bd-f5d7-479e-a8fe-2c6e70a70fae): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.232971 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" podUID="88da73a4-c9e2-4a78-b313-8cf689562e38" Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.362041 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" podUID="3bce910d-be3e-4332-89df-75e715d95988" Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.426247 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" podUID="52aa93bd-f5d7-479e-a8fe-2c6e70a70fae" Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.503944 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" event={"ID":"c9f2f281-c656-4c29-bf86-c38f9cd79528","Type":"ContainerStarted","Data":"9251ebd2f2429aa25ef4535c18a8889138116fe5dd0f73e90f33cf95e349e907"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.510626 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" event={"ID":"a1cbd98a-2f66-4649-8347-938d07f93eb1","Type":"ContainerStarted","Data":"d4774e5ff027faed7fa4751793249ed7650bf6d3a4a1697cb010e09716b91911"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.527610 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" event={"ID":"1013022f-3fa2-44d5-a111-5f89a6a7bb17","Type":"ContainerStarted","Data":"d3c45d0ed77742d17008fa5fef655a3d8fe76cd2d373dd4a1b85aff60e0f3470"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.527697 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" 
event={"ID":"1013022f-3fa2-44d5-a111-5f89a6a7bb17","Type":"ContainerStarted","Data":"05095d45b7c6accb2f5b100e1b6c6461af309b977d1723ba94b0e74a565bc33a"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.538596 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" event={"ID":"2a1af285-1505-419c-bacc-16d8a161aca2","Type":"ContainerStarted","Data":"e4b03ccc6fc722e6fc815421f011cb899753e94c3937f18b1198118153959a5d"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.538661 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" event={"ID":"2a1af285-1505-419c-bacc-16d8a161aca2","Type":"ContainerStarted","Data":"8ce773fc3348e558b61d2bb5eef5790bf731fb62e27af2991a4d85aeaeb6c2fb"} Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.550139 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:f6b935f67979298c3c263ad84d277e5cf26c0dbba3f85f255c1ec4d1d75241d2\\\"\"" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" podUID="2a1af285-1505-419c-bacc-16d8a161aca2" Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.556482 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" event={"ID":"1d4f2da8-966a-4a80-aca6-efdd8faca337","Type":"ContainerStarted","Data":"1d4b28809cb84b63e9c7a4c020abf1bd8ad6d22360dea28de27038e33772f262"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.558418 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" event={"ID":"a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af","Type":"ContainerStarted","Data":"ce830537d4da5c72cb552109ed4bf7651fec09649a0bc05ebeb35edf14667191"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.558443 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" event={"ID":"a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af","Type":"ContainerStarted","Data":"408a281fa699f8c89b762f0b12fc25bf40bb7bc4835f9fa3abd260f18791c55a"} Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.559593 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" podUID="a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af" Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.561597 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" event={"ID":"54fbcf55-e81d-4336-8e38-9bb1d3ec3c47","Type":"ContainerStarted","Data":"d577337188092b1df0a038d3fa9ec04718fd0e59315e9f4e4d2b61639d6d63ef"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.562874 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" event={"ID":"88c75f60-538f-4059-aaeb-b41dcdcf7cfa","Type":"ContainerStarted","Data":"fe540f8944b1704760f41f8859ff9e10a1a4967026d653ba2df226dcde66ba84"} Sep 30 06:33:21 crc 
kubenswrapper[4691]: I0930 06:33:21.562909 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" event={"ID":"88c75f60-538f-4059-aaeb-b41dcdcf7cfa","Type":"ContainerStarted","Data":"9270171eeb5b9b63ba46f8bdee00968e70aef6c55652452ea8a3513fd0a5b94e"} Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.576498 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" podUID="88c75f60-538f-4059-aaeb-b41dcdcf7cfa" Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.576766 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" podUID="54fbcf55-e81d-4336-8e38-9bb1d3ec3c47" Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.577780 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" event={"ID":"d6fb63f5-e7b6-47fd-ac44-b59058899b3c","Type":"ContainerStarted","Data":"eb03c14d386b1ea4e346e738630d658b75cebeb2fe535f3e5f32f2c0c3a651ac"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.606091 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" event={"ID":"3bce910d-be3e-4332-89df-75e715d95988","Type":"ContainerStarted","Data":"710c3e4070d77c671faa9e3098326434f6014907cfc6b3b1d93b09a5fe224688"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.606289 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" event={"ID":"3bce910d-be3e-4332-89df-75e715d95988","Type":"ContainerStarted","Data":"f56e0d708413ad53c06b8de07cd4175aeca848035c2a9832422c928a528b3317"} Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.611995 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" podUID="3bce910d-be3e-4332-89df-75e715d95988" Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.628327 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" event={"ID":"d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2","Type":"ContainerStarted","Data":"fff9b2e95988f8b865b939310f8bb3d97489121e2d759d397cceb57aa4f2719f"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.649245 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" event={"ID":"88da73a4-c9e2-4a78-b313-8cf689562e38","Type":"ContainerStarted","Data":"a1bd80435390aef0e9d8910a57e2ee5ca0e1833e650b354b54d9afc38206511f"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 
06:33:21.649291 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" event={"ID":"88da73a4-c9e2-4a78-b313-8cf689562e38","Type":"ContainerStarted","Data":"da5addb3d10a065ce1a28393bf3dbe17bfb3f27898a0a1c67d90b75e3679b88f"} Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.654056 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" podUID="88da73a4-c9e2-4a78-b313-8cf689562e38" Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.659393 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" event={"ID":"52aa93bd-f5d7-479e-a8fe-2c6e70a70fae","Type":"ContainerStarted","Data":"a99cca2293789769a585980be480cf2ffdc2d2cc437461c95c68b459941373f6"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.659660 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" event={"ID":"52aa93bd-f5d7-479e-a8fe-2c6e70a70fae","Type":"ContainerStarted","Data":"21a9c665b1cbd663f5ea9423c661375ac433656876d392fab5d3738cfac6ee8f"} Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.661336 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" podUID="52aa93bd-f5d7-479e-a8fe-2c6e70a70fae" Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.661955 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" event={"ID":"a5779e0d-8902-4a45-b28e-4253af3938ae","Type":"ContainerStarted","Data":"42a0a0e21e2e41b44b3782090f5c75b8cb9d6a02c04e183ebb277a28e6772504"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.665801 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" event={"ID":"2c674607-65d9-4be2-9244-d61eadb97dd7","Type":"ContainerStarted","Data":"f69b6b95fe748326fce993ea01d321e9ca9b1d4ea4466156b131f681968b0f60"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.665829 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" event={"ID":"2c674607-65d9-4be2-9244-d61eadb97dd7","Type":"ContainerStarted","Data":"f550540f7742c7b56b69e06f386b8c0bce8b204809d183fc1eae5685034b49a9"} Sep 30 06:33:21 crc kubenswrapper[4691]: E0930 06:33:21.667561 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" podUID="2c674607-65d9-4be2-9244-d61eadb97dd7" Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.669117 4691 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" event={"ID":"11bd74e6-05a4-44fc-b360-f1d71352011e","Type":"ContainerStarted","Data":"c2c238751054df0537e7863e702c4ec6a552b93015349c98a5019aa5b43fd39d"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.672365 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" event={"ID":"9678f82b-58e6-4529-bdf6-6faaf2d7bcfa","Type":"ContainerStarted","Data":"e152f079693e905bed22159fc82f4820ae02617723fa7c167db69ddde060da85"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.673632 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" event={"ID":"a1aaa7fa-8695-4124-ad5a-26f11a99b1c8","Type":"ContainerStarted","Data":"f876ad8abc5e0822fead68b6371a1997f0c2df015d22c5acdad55e1edbd99ff6"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.692005 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" event={"ID":"9c5c1b63-6185-424c-a584-35a18e2c69bd","Type":"ContainerStarted","Data":"dcaebe00130f80dc970a1ce9bfc726728a65d171c49dabe922a3a7b92b255c16"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.700005 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" event={"ID":"5c0ba848-ac6e-4515-99e3-e1665ff79d7c","Type":"ContainerStarted","Data":"653848e72d08036fb5dd7e0a6d2530604dc15c1f9ada849637dea8283541ac37"} Sep 30 06:33:21 crc kubenswrapper[4691]: I0930 06:33:21.726500 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" event={"ID":"a8dd4aa3-ab8b-4f66-9722-8873600c87eb","Type":"ContainerStarted","Data":"bb524067b5db56b93ee2d02eb4bc53e26a40940353d347e92172b19d13172b49"} Sep 30 06:33:22 crc kubenswrapper[4691]: I0930 06:33:22.750022 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" event={"ID":"1013022f-3fa2-44d5-a111-5f89a6a7bb17","Type":"ContainerStarted","Data":"e056e356d4e90c4e96ebb343b537c74dfe3cb7599886712ff35762ff4762fe12"} Sep 30 06:33:22 crc kubenswrapper[4691]: I0930 06:33:22.750844 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:22 crc kubenswrapper[4691]: E0930 06:33:22.752239 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:a6b3408d79df6b6d4a467e49defaa4a9d9c088c94d0605a4fee0030c9ccc84d2\\\"\"" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" podUID="52aa93bd-f5d7-479e-a8fe-2c6e70a70fae" Sep 30 06:33:22 crc kubenswrapper[4691]: E0930 06:33:22.753063 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:3c6f7d737e0196ec302f44354228d783ad3b210a75703dda3b39c15c01a67e8c\\\"\"" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" podUID="3bce910d-be3e-4332-89df-75e715d95988" Sep 30 06:33:22 crc 
kubenswrapper[4691]: E0930 06:33:22.755017 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:8fdf377daf05e2fa7346505017078fa81981dd945bf635a64c8022633c68118f\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" podUID="88da73a4-c9e2-4a78-b313-8cf689562e38" Sep 30 06:33:22 crc kubenswrapper[4691]: E0930 06:33:22.755044 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:4d08afd31dc5ded10c54a5541f514ac351e9b40a183285b3db27d0757a6354c8\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" podUID="88c75f60-538f-4059-aaeb-b41dcdcf7cfa" Sep 30 06:33:22 crc kubenswrapper[4691]: E0930 06:33:22.755105 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:1051afc168038fb814f75e7a5f07c588b295a83ebd143dcd8b46d799e31ad302\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" podUID="a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af" Sep 30 06:33:22 crc kubenswrapper[4691]: E0930 06:33:22.755171 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:225524223bf2a7f3a4ce95958fc9ca6fdab02745fb70374e8ff5bf1ddaceda4b\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" podUID="54fbcf55-e81d-4336-8e38-9bb1d3ec3c47" Sep 30 06:33:22 crc kubenswrapper[4691]: E0930 06:33:22.755207 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:f6b935f67979298c3c263ad84d277e5cf26c0dbba3f85f255c1ec4d1d75241d2\\\"\"" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" podUID="2a1af285-1505-419c-bacc-16d8a161aca2" Sep 30 06:33:22 crc kubenswrapper[4691]: E0930 06:33:22.755275 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:485df5c7813cdf4cf21f48ec48c8e3e4962fee6a1ae4c64f7af127d5ab346a10\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" podUID="2c674607-65d9-4be2-9244-d61eadb97dd7" Sep 30 06:33:22 crc kubenswrapper[4691]: I0930 06:33:22.850155 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:33:22 crc kubenswrapper[4691]: I0930 06:33:22.850217 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" Sep 30 06:33:22 crc kubenswrapper[4691]: I0930 06:33:22.850260 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:33:22 crc kubenswrapper[4691]: I0930 06:33:22.850827 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a93cc69e9131d7c4e2a3f6590c1d8cfd39f8977341d3f1a63ae9e1ccb3a86989"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:33:22 crc kubenswrapper[4691]: I0930 06:33:22.850897 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://a93cc69e9131d7c4e2a3f6590c1d8cfd39f8977341d3f1a63ae9e1ccb3a86989" gracePeriod=600 Sep 30 06:33:22 crc kubenswrapper[4691]: I0930 06:33:22.867132 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" podStartSLOduration=3.8671175570000003 podStartE2EDuration="3.867117557s" podCreationTimestamp="2025-09-30 06:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:33:22.866771756 +0000 UTC m=+846.341792806" watchObservedRunningTime="2025-09-30 06:33:22.867117557 +0000 UTC m=+846.342138587" Sep 30 06:33:23 crc kubenswrapper[4691]: I0930 06:33:23.035020 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-r8s99\" (UID: \"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:23 crc kubenswrapper[4691]: I0930 06:33:23.042229 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d-cert\") pod \"openstack-baremetal-operator-controller-manager-6d776955-r8s99\" (UID: \"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:23 crc kubenswrapper[4691]: I0930 06:33:23.112963 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:23 crc kubenswrapper[4691]: I0930 06:33:23.761032 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="a93cc69e9131d7c4e2a3f6590c1d8cfd39f8977341d3f1a63ae9e1ccb3a86989" exitCode=0 Sep 30 06:33:23 crc kubenswrapper[4691]: I0930 06:33:23.761084 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"a93cc69e9131d7c4e2a3f6590c1d8cfd39f8977341d3f1a63ae9e1ccb3a86989"} Sep 30 06:33:23 crc kubenswrapper[4691]: I0930 06:33:23.762354 4691 scope.go:117] "RemoveContainer" containerID="31e757fc7bb8d72540655d2ce1c4ea6d10d3a5eb3fd6ea0108f524dba7e5bca2" Sep 30 06:33:30 crc kubenswrapper[4691]: I0930 06:33:30.559936 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-86d6bdfc6d-7zkhq" Sep 30 06:33:30 crc kubenswrapper[4691]: I0930 06:33:30.938258 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99"] Sep 30 06:33:30 crc kubenswrapper[4691]: W0930 06:33:30.971555 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a38ca7d_d01d_4a0e_adfa_4875c46d8d7d.slice/crio-f918600cd9350cb3b7079bdb8153cd949813444db1ad2fa07e26a4469e59fa05 WatchSource:0}: Error finding container f918600cd9350cb3b7079bdb8153cd949813444db1ad2fa07e26a4469e59fa05: Status 404 returned error can't find the container with id f918600cd9350cb3b7079bdb8153cd949813444db1ad2fa07e26a4469e59fa05 Sep 30 06:33:30 crc kubenswrapper[4691]: I0930 06:33:30.974896 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:33:31 crc kubenswrapper[4691]: I0930 06:33:31.889129 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" event={"ID":"d6fb63f5-e7b6-47fd-ac44-b59058899b3c","Type":"ContainerStarted","Data":"c7b4e9fc24482d104cd87e35adb51402affd92dcfca85e24972908e398269a5d"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:31.910137 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" event={"ID":"d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2","Type":"ContainerStarted","Data":"69e1fd3f64ce1d2849b479c6c56fcbbabf5302d24c54e1c4ca560b5d8cf16418"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:31.932179 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" event={"ID":"1d4f2da8-966a-4a80-aca6-efdd8faca337","Type":"ContainerStarted","Data":"30481874532f97551642d665e5df2d623e0a59d2a98274a2f05f39f1b3bb65db"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:31.989433 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" event={"ID":"5c0ba848-ac6e-4515-99e3-e1665ff79d7c","Type":"ContainerStarted","Data":"796b4172f969b1a779c9c555fa30a16d4e6c39d06759d7d1445e34eee6030451"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.021262 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" event={"ID":"60dcfaf5-c692-44e5-8868-1dfccb14f535","Type":"ContainerStarted","Data":"3d6c44995d5bc7da4b182d256078864ce65efea4ff3b6e1b035ca10b91b70358"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.073829 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" event={"ID":"9678f82b-58e6-4529-bdf6-6faaf2d7bcfa","Type":"ContainerStarted","Data":"e4b732e25c3f8d73c953481d2e1ec2cad7d7191315608e7478eddef18b4713e0"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.091423 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" event={"ID":"9c5c1b63-6185-424c-a584-35a18e2c69bd","Type":"ContainerStarted","Data":"be0600938675b488588701a7bf8604a57351430ecfb084e06c486a1a1a5689d9"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.104196 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" event={"ID":"c9f2f281-c656-4c29-bf86-c38f9cd79528","Type":"ContainerStarted","Data":"3686151ee951aef27e127edae0aa3caff2edd7b968560f2c7977d00321765b6e"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.135661 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"38f0c707492af70fdfb0f260acc0b7e0af55b1c1967ae7e929f5286c470b2dd6"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.154402 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" event={"ID":"a1aaa7fa-8695-4124-ad5a-26f11a99b1c8","Type":"ContainerStarted","Data":"70a63eb3ee11dba14ff72a325a795d3d6560e6d0e722a35a06bf41fd0b89b48e"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.160864 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.165419 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" event={"ID":"a8dd4aa3-ab8b-4f66-9722-8873600c87eb","Type":"ContainerStarted","Data":"36ab08509a93b247d063ffc4e4937c92adc5af8f0c23bf405e3402f6d6d76a64"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.165693 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.169417 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" event={"ID":"11bd74e6-05a4-44fc-b360-f1d71352011e","Type":"ContainerStarted","Data":"22c25cacf1a97d73d0c97e5c7163fd415ebdb40081a7f6f86c199f12a04b1853"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.170613 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" event={"ID":"a1cbd98a-2f66-4649-8347-938d07f93eb1","Type":"ContainerStarted","Data":"db406286a05ac36f794b7efe19034b80db9706c97482a78c7bbef4538c72966c"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.171548 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" event={"ID":"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d","Type":"ContainerStarted","Data":"f918600cd9350cb3b7079bdb8153cd949813444db1ad2fa07e26a4469e59fa05"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.174535 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" event={"ID":"a5779e0d-8902-4a45-b28e-4253af3938ae","Type":"ContainerStarted","Data":"f2defcf958376834774468816826650cf0c1ba1d35fc4d6a02293e7449c34cac"} Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.188343 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" podStartSLOduration=4.231182171 podStartE2EDuration="14.188321821s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.694029694 +0000 UTC m=+844.169050734" lastFinishedPulling="2025-09-30 06:33:30.651169334 +0000 UTC m=+854.126190384" observedRunningTime="2025-09-30 06:33:32.181352947 +0000 UTC m=+855.656373997" watchObservedRunningTime="2025-09-30 06:33:32.188321821 +0000 UTC m=+855.663342871" Sep 30 06:33:32 crc kubenswrapper[4691]: I0930 06:33:32.202043 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" podStartSLOduration=4.24331755 podStartE2EDuration="14.202030109s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.691561366 +0000 UTC m=+844.166582406" lastFinishedPulling="2025-09-30 06:33:30.650273915 +0000 UTC m=+854.125294965" observedRunningTime="2025-09-30 06:33:32.201064018 +0000 UTC m=+855.676085068" watchObservedRunningTime="2025-09-30 06:33:32.202030109 +0000 UTC m=+855.677051149" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.187227 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" event={"ID":"d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2","Type":"ContainerStarted","Data":"1c924a93e9a90468a41ba56e5db5aa9fd86c0970a9154f95b63448275a8c2c82"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.187798 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.190714 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" event={"ID":"11bd74e6-05a4-44fc-b360-f1d71352011e","Type":"ContainerStarted","Data":"5895553df54b215ab1dbf9ac71d9890d1b39b0e7e3effb1d9771bc0240420b49"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.190935 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.192563 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" event={"ID":"9678f82b-58e6-4529-bdf6-6faaf2d7bcfa","Type":"ContainerStarted","Data":"46f56e7ac13535b25b9f8cecac2845cb220f0482c5c8cf44d95016e9a7c54dcd"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.192682 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.194575 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" event={"ID":"a1aaa7fa-8695-4124-ad5a-26f11a99b1c8","Type":"ContainerStarted","Data":"f944d5fb5013b49bb946d06a2cf34fc4f4cd340b1b88d84b37a146cfa25f34e5"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.196020 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" event={"ID":"5c0ba848-ac6e-4515-99e3-e1665ff79d7c","Type":"ContainerStarted","Data":"78b0b08191eb4453ed051681849454ea90b67ac531f508f27160b71b3474c1f3"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.196141 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.197862 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" event={"ID":"a1cbd98a-2f66-4649-8347-938d07f93eb1","Type":"ContainerStarted","Data":"482764bdef5ddbcb195eeb1a9e4df4c455bfa4e8c870e8eaab6f6ce5909d025b"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.198052 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.216665 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" event={"ID":"60dcfaf5-c692-44e5-8868-1dfccb14f535","Type":"ContainerStarted","Data":"c1eb1b24bb1d47632485b3d98416aec3452c08492977405e22029eec6f1d3efe"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.217391 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.218922 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" event={"ID":"1d4f2da8-966a-4a80-aca6-efdd8faca337","Type":"ContainerStarted","Data":"b2bb012b73c52dc4801c42fa61bbccb01679857aea9df9525efcfd5ca9c04be5"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.219510 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.220980 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" event={"ID":"9c5c1b63-6185-424c-a584-35a18e2c69bd","Type":"ContainerStarted","Data":"4e332f5523e44e4f9bfd144cebabb52476756394b94a1dd84eed6cea82cae1a6"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.221144 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.235356 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" podStartSLOduration=5.223500704 podStartE2EDuration="15.23533945s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" 
firstStartedPulling="2025-09-30 06:33:20.660197835 +0000 UTC m=+844.135218875" lastFinishedPulling="2025-09-30 06:33:30.672036571 +0000 UTC m=+854.147057621" observedRunningTime="2025-09-30 06:33:33.206443305 +0000 UTC m=+856.681464355" watchObservedRunningTime="2025-09-30 06:33:33.23533945 +0000 UTC m=+856.710360490" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.236546 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" podStartSLOduration=5.230459855 podStartE2EDuration="15.236539668s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.660190504 +0000 UTC m=+844.135211544" lastFinishedPulling="2025-09-30 06:33:30.666270307 +0000 UTC m=+854.141291357" observedRunningTime="2025-09-30 06:33:33.229069289 +0000 UTC m=+856.704090339" watchObservedRunningTime="2025-09-30 06:33:33.236539668 +0000 UTC m=+856.711560708" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.246609 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.246641 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" event={"ID":"c9f2f281-c656-4c29-bf86-c38f9cd79528","Type":"ContainerStarted","Data":"89fde682d781e5dc6b793285ca15bcb0ef44d2ef8eb4751270917274ca813633"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.246657 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.246668 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" event={"ID":"a8dd4aa3-ab8b-4f66-9722-8873600c87eb","Type":"ContainerStarted","Data":"7fcfa7bf2ea46b71de7614c66366e6dacfaf0c9be79fc3424b943820e12688bd"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.246678 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" event={"ID":"d6fb63f5-e7b6-47fd-ac44-b59058899b3c","Type":"ContainerStarted","Data":"800f6d47c67efa442c32aeddd72fd66cf0dd09559e574fe082f9dba692c98e6f"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.250126 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" event={"ID":"a5779e0d-8902-4a45-b28e-4253af3938ae","Type":"ContainerStarted","Data":"5fe3550495fd73b9874356e967fe554c92ad3a3430cc24db9d29123f660ec0e7"} Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.250161 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.261957 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" podStartSLOduration=4.671804873 podStartE2EDuration="14.26194134s" podCreationTimestamp="2025-09-30 06:33:19 +0000 UTC" firstStartedPulling="2025-09-30 06:33:21.083837067 +0000 UTC m=+844.558858107" lastFinishedPulling="2025-09-30 06:33:30.673973524 +0000 UTC m=+854.148994574" observedRunningTime="2025-09-30 06:33:33.25379869 +0000 UTC 
m=+856.728819740" watchObservedRunningTime="2025-09-30 06:33:33.26194134 +0000 UTC m=+856.736962380" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.269092 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" podStartSLOduration=4.601177301 podStartE2EDuration="14.269074999s" podCreationTimestamp="2025-09-30 06:33:19 +0000 UTC" firstStartedPulling="2025-09-30 06:33:21.013140931 +0000 UTC m=+844.488161971" lastFinishedPulling="2025-09-30 06:33:30.681038619 +0000 UTC m=+854.156059669" observedRunningTime="2025-09-30 06:33:33.266659181 +0000 UTC m=+856.741680221" watchObservedRunningTime="2025-09-30 06:33:33.269074999 +0000 UTC m=+856.744096039" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.282205 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" podStartSLOduration=5.293816944 podStartE2EDuration="15.282181388s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.682982255 +0000 UTC m=+844.158003295" lastFinishedPulling="2025-09-30 06:33:30.671346689 +0000 UTC m=+854.146367739" observedRunningTime="2025-09-30 06:33:33.281705983 +0000 UTC m=+856.756727033" watchObservedRunningTime="2025-09-30 06:33:33.282181388 +0000 UTC m=+856.757202428" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.305729 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" podStartSLOduration=5.274871933 podStartE2EDuration="15.30571151s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.642354541 +0000 UTC m=+844.117375581" lastFinishedPulling="2025-09-30 06:33:30.673194108 +0000 UTC m=+854.148215158" observedRunningTime="2025-09-30 06:33:33.300332948 +0000 UTC m=+856.775354008" watchObservedRunningTime="2025-09-30 06:33:33.30571151 +0000 UTC m=+856.780732550" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.325614 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" podStartSLOduration=5.34225542 podStartE2EDuration="15.325599216s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.690060089 +0000 UTC m=+844.165081129" lastFinishedPulling="2025-09-30 06:33:30.673403875 +0000 UTC m=+854.148424925" observedRunningTime="2025-09-30 06:33:33.320290217 +0000 UTC m=+856.795311267" watchObservedRunningTime="2025-09-30 06:33:33.325599216 +0000 UTC m=+856.800620256" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.360381 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" podStartSLOduration=4.963623755 podStartE2EDuration="15.360368629s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.252431156 +0000 UTC m=+843.727452196" lastFinishedPulling="2025-09-30 06:33:30.64917602 +0000 UTC m=+854.124197070" observedRunningTime="2025-09-30 06:33:33.340640218 +0000 UTC m=+856.815661268" watchObservedRunningTime="2025-09-30 06:33:33.360368629 +0000 UTC m=+856.835389669" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.384480 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" podStartSLOduration=5.3949417969999995 podStartE2EDuration="15.384468919s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.682971354 +0000 UTC m=+844.157992394" lastFinishedPulling="2025-09-30 06:33:30.672498476 +0000 UTC m=+854.147519516" observedRunningTime="2025-09-30 06:33:33.380915326 +0000 UTC m=+856.855936376" watchObservedRunningTime="2025-09-30 06:33:33.384468919 +0000 UTC m=+856.859489959" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.385510 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" podStartSLOduration=5.381309296 podStartE2EDuration="15.385504403s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.647894256 +0000 UTC m=+844.122915296" lastFinishedPulling="2025-09-30 06:33:30.652089353 +0000 UTC m=+854.127110403" observedRunningTime="2025-09-30 06:33:33.362364472 +0000 UTC m=+856.837385522" watchObservedRunningTime="2025-09-30 06:33:33.385504403 +0000 UTC m=+856.860525443" Sep 30 06:33:33 crc kubenswrapper[4691]: I0930 06:33:33.399776 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" podStartSLOduration=5.418295182 podStartE2EDuration="15.399765659s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.69106066 +0000 UTC m=+844.166081690" lastFinishedPulling="2025-09-30 06:33:30.672531117 +0000 UTC m=+854.147552167" observedRunningTime="2025-09-30 06:33:33.397662772 +0000 UTC m=+856.872683812" watchObservedRunningTime="2025-09-30 06:33:33.399765659 +0000 UTC m=+856.874786699" Sep 30 06:33:34 crc kubenswrapper[4691]: I0930 06:33:34.259159 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" event={"ID":"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d","Type":"ContainerStarted","Data":"570df803fdf12a0faf10277b90e38877a4ccb38f8517260efebafacb8e01d7fd"} Sep 30 06:33:35 crc kubenswrapper[4691]: I0930 06:33:35.269594 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" event={"ID":"4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d","Type":"ContainerStarted","Data":"395b9d32df21052737c925e15f9d7e3091f58549d4b4437d67a58c85bebb306a"} Sep 30 06:33:35 crc kubenswrapper[4691]: I0930 06:33:35.312760 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" podStartSLOduration=14.381003644 podStartE2EDuration="17.312734657s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:30.974663181 +0000 UTC m=+854.449684221" lastFinishedPulling="2025-09-30 06:33:33.906394184 +0000 UTC m=+857.381415234" observedRunningTime="2025-09-30 06:33:35.311071693 +0000 UTC m=+858.786092803" watchObservedRunningTime="2025-09-30 06:33:35.312734657 +0000 UTC m=+858.787755737" Sep 30 06:33:36 crc kubenswrapper[4691]: I0930 06:33:36.281115 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.076429 4691 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6ff8b75857-qqx55" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.102622 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-644bddb6d8-9dj86" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.140858 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-84958c4d49-qwmv9" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.153197 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5d889d78cf-5j6fw" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.194493 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7d857cc749-jgh2d" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.212251 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-9f4696d94-rxdj9" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.257355 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-7975b88857-x79vm" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.257485 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-5bd55b4bff-9xs5h" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.459560 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-88c7-c5nbk" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.467493 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-6d68dbc695-rtqwx" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.495622 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-c7c776c96-gtznz" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.934188 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-f66b554c6-8qj9q" Sep 30 06:33:39 crc kubenswrapper[4691]: I0930 06:33:39.965773 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bd494bc6d-x495w" Sep 30 06:33:43 crc kubenswrapper[4691]: I0930 06:33:43.122048 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6d776955-r8s99" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.365497 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" event={"ID":"88c75f60-538f-4059-aaeb-b41dcdcf7cfa","Type":"ContainerStarted","Data":"831b4847c3cbc0c77cd6c59447e9d28c6177b47ae1b013497ae96ccb798c2384"} Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.366180 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.367322 4691 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" event={"ID":"3bce910d-be3e-4332-89df-75e715d95988","Type":"ContainerStarted","Data":"9586ef65eb128d473d337e6e611d2ff5c6a3749fa45eb12ad9210e2e5ae7d1d1"} Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.367545 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.368825 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" event={"ID":"88da73a4-c9e2-4a78-b313-8cf689562e38","Type":"ContainerStarted","Data":"cdb8116ac4f5155fdc0a07bae95b8b49b1696c26403ed80f336bba347fc890e6"} Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.369023 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.370513 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" event={"ID":"52aa93bd-f5d7-479e-a8fe-2c6e70a70fae","Type":"ContainerStarted","Data":"ad35455d81151a6a81a6ca1ba9488beee775adbff88c2f2654c64d1a47c6dec6"} Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.370670 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.371843 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" event={"ID":"2c674607-65d9-4be2-9244-d61eadb97dd7","Type":"ContainerStarted","Data":"304fec4051f0cc18e5f55e8d700618fe8aaaf32c4acd306b387502848e9dfb76"} Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.372015 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.373113 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" event={"ID":"2a1af285-1505-419c-bacc-16d8a161aca2","Type":"ContainerStarted","Data":"f413e099fcb0dd15843927541ff67fddd0ba2d2586ad4762d37b81d52c463d86"} Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.373277 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.374770 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" event={"ID":"a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af","Type":"ContainerStarted","Data":"22689e07bef28c74cc1d4b60cd1c8fadafa4c616e192ebc91ca3491f777a8279"} Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.374943 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.376105 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" 
event={"ID":"54fbcf55-e81d-4336-8e38-9bb1d3ec3c47","Type":"ContainerStarted","Data":"930dad63e6a689b44bdbb0a8fef0bcd327b170d3b5e1771765286a39f71d1f7a"} Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.410489 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" podStartSLOduration=4.053151122 podStartE2EDuration="28.410472592s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.704303429 +0000 UTC m=+844.179324469" lastFinishedPulling="2025-09-30 06:33:45.061624859 +0000 UTC m=+868.536645939" observedRunningTime="2025-09-30 06:33:46.388695206 +0000 UTC m=+869.863716246" watchObservedRunningTime="2025-09-30 06:33:46.410472592 +0000 UTC m=+869.885493632" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.411042 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" podStartSLOduration=3.002662446 podStartE2EDuration="27.41103627s" podCreationTimestamp="2025-09-30 06:33:19 +0000 UTC" firstStartedPulling="2025-09-30 06:33:21.155342165 +0000 UTC m=+844.630363205" lastFinishedPulling="2025-09-30 06:33:45.563715979 +0000 UTC m=+869.038737029" observedRunningTime="2025-09-30 06:33:46.407297731 +0000 UTC m=+869.882318771" watchObservedRunningTime="2025-09-30 06:33:46.41103627 +0000 UTC m=+869.886057310" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.435765 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" podStartSLOduration=3.042644045 podStartE2EDuration="27.435750861s" podCreationTimestamp="2025-09-30 06:33:19 +0000 UTC" firstStartedPulling="2025-09-30 06:33:21.156282924 +0000 UTC m=+844.631303964" lastFinishedPulling="2025-09-30 06:33:45.54938974 +0000 UTC m=+869.024410780" observedRunningTime="2025-09-30 06:33:46.431179685 +0000 UTC m=+869.906200735" watchObservedRunningTime="2025-09-30 06:33:46.435750861 +0000 UTC m=+869.910771901" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.448925 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" podStartSLOduration=3.605301628 podStartE2EDuration="28.448909822s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.704388372 +0000 UTC m=+844.179409412" lastFinishedPulling="2025-09-30 06:33:45.547996536 +0000 UTC m=+869.023017606" observedRunningTime="2025-09-30 06:33:46.446725102 +0000 UTC m=+869.921746152" watchObservedRunningTime="2025-09-30 06:33:46.448909822 +0000 UTC m=+869.923930862" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.472287 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" podStartSLOduration=3.723271735 podStartE2EDuration="28.472271159s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.80051447 +0000 UTC m=+844.275535510" lastFinishedPulling="2025-09-30 06:33:45.549513884 +0000 UTC m=+869.024534934" observedRunningTime="2025-09-30 06:33:46.470066468 +0000 UTC m=+869.945087528" watchObservedRunningTime="2025-09-30 06:33:46.472271159 +0000 UTC m=+869.947292199" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.498806 4691 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-79d8469568-7v82c" podStartSLOduration=2.967176201 podStartE2EDuration="27.498790737s" podCreationTimestamp="2025-09-30 06:33:19 +0000 UTC" firstStartedPulling="2025-09-30 06:33:21.126474971 +0000 UTC m=+844.601496011" lastFinishedPulling="2025-09-30 06:33:45.658089507 +0000 UTC m=+869.133110547" observedRunningTime="2025-09-30 06:33:46.494617763 +0000 UTC m=+869.969638813" watchObservedRunningTime="2025-09-30 06:33:46.498790737 +0000 UTC m=+869.973811777" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.499798 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" podStartSLOduration=4.264601931 podStartE2EDuration="28.49979236s" podCreationTimestamp="2025-09-30 06:33:18 +0000 UTC" firstStartedPulling="2025-09-30 06:33:20.826398888 +0000 UTC m=+844.301419928" lastFinishedPulling="2025-09-30 06:33:45.061589327 +0000 UTC m=+868.536610357" observedRunningTime="2025-09-30 06:33:46.484294594 +0000 UTC m=+869.959315644" watchObservedRunningTime="2025-09-30 06:33:46.49979236 +0000 UTC m=+869.974813390" Sep 30 06:33:46 crc kubenswrapper[4691]: I0930 06:33:46.509463 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" podStartSLOduration=3.473640911 podStartE2EDuration="27.509445968s" podCreationTimestamp="2025-09-30 06:33:19 +0000 UTC" firstStartedPulling="2025-09-30 06:33:21.025918685 +0000 UTC m=+844.500939725" lastFinishedPulling="2025-09-30 06:33:45.061723732 +0000 UTC m=+868.536744782" observedRunningTime="2025-09-30 06:33:46.508262311 +0000 UTC m=+869.983283361" watchObservedRunningTime="2025-09-30 06:33:46.509445968 +0000 UTC m=+869.984466998" Sep 30 06:33:59 crc kubenswrapper[4691]: I0930 06:33:59.114587 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-84f4f7b77b-8wf6n" Sep 30 06:33:59 crc kubenswrapper[4691]: I0930 06:33:59.487976 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-64d7b59854-2kzwg" Sep 30 06:33:59 crc kubenswrapper[4691]: I0930 06:33:59.508503 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-9976ff44c-bzx6l" Sep 30 06:33:59 crc kubenswrapper[4691]: I0930 06:33:59.560575 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-76fcc6dc7c-h42cw" Sep 30 06:33:59 crc kubenswrapper[4691]: I0930 06:33:59.859808 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-589c58c6c-vd4rw" Sep 30 06:33:59 crc kubenswrapper[4691]: I0930 06:33:59.890928 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-bc7dc7bd9-s4rpv" Sep 30 06:33:59 crc kubenswrapper[4691]: I0930 06:33:59.946826 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-b8d54b5d7-44qv5" Sep 30 06:34:17 crc kubenswrapper[4691]: I0930 06:34:17.958836 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77479b959-nvfnl"] Sep 30 06:34:17 crc kubenswrapper[4691]: 
I0930 06:34:17.960420 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77479b959-nvfnl" Sep 30 06:34:17 crc kubenswrapper[4691]: I0930 06:34:17.966443 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-trzk7" Sep 30 06:34:17 crc kubenswrapper[4691]: I0930 06:34:17.968273 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Sep 30 06:34:17 crc kubenswrapper[4691]: I0930 06:34:17.971442 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Sep 30 06:34:17 crc kubenswrapper[4691]: I0930 06:34:17.971528 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Sep 30 06:34:17 crc kubenswrapper[4691]: I0930 06:34:17.998115 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77479b959-nvfnl"] Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.070537 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-mf8ss"] Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.072031 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.075722 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.088553 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-mf8ss"] Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.120595 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/223c32e4-0e0d-4ec5-aed4-823e434a40d1-config\") pod \"dnsmasq-dns-77479b959-nvfnl\" (UID: \"223c32e4-0e0d-4ec5-aed4-823e434a40d1\") " pod="openstack/dnsmasq-dns-77479b959-nvfnl" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.120642 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqh66\" (UniqueName: \"kubernetes.io/projected/223c32e4-0e0d-4ec5-aed4-823e434a40d1-kube-api-access-rqh66\") pod \"dnsmasq-dns-77479b959-nvfnl\" (UID: \"223c32e4-0e0d-4ec5-aed4-823e434a40d1\") " pod="openstack/dnsmasq-dns-77479b959-nvfnl" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.222109 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-config\") pod \"dnsmasq-dns-8b8d888b5-mf8ss\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.222214 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm9cc\" (UniqueName: \"kubernetes.io/projected/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-kube-api-access-cm9cc\") pod \"dnsmasq-dns-8b8d888b5-mf8ss\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.222273 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/223c32e4-0e0d-4ec5-aed4-823e434a40d1-config\") pod \"dnsmasq-dns-77479b959-nvfnl\" (UID: 
\"223c32e4-0e0d-4ec5-aed4-823e434a40d1\") " pod="openstack/dnsmasq-dns-77479b959-nvfnl" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.222296 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-dns-svc\") pod \"dnsmasq-dns-8b8d888b5-mf8ss\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.222452 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqh66\" (UniqueName: \"kubernetes.io/projected/223c32e4-0e0d-4ec5-aed4-823e434a40d1-kube-api-access-rqh66\") pod \"dnsmasq-dns-77479b959-nvfnl\" (UID: \"223c32e4-0e0d-4ec5-aed4-823e434a40d1\") " pod="openstack/dnsmasq-dns-77479b959-nvfnl" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.223282 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/223c32e4-0e0d-4ec5-aed4-823e434a40d1-config\") pod \"dnsmasq-dns-77479b959-nvfnl\" (UID: \"223c32e4-0e0d-4ec5-aed4-823e434a40d1\") " pod="openstack/dnsmasq-dns-77479b959-nvfnl" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.248813 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqh66\" (UniqueName: \"kubernetes.io/projected/223c32e4-0e0d-4ec5-aed4-823e434a40d1-kube-api-access-rqh66\") pod \"dnsmasq-dns-77479b959-nvfnl\" (UID: \"223c32e4-0e0d-4ec5-aed4-823e434a40d1\") " pod="openstack/dnsmasq-dns-77479b959-nvfnl" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.276197 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77479b959-nvfnl" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.323488 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-config\") pod \"dnsmasq-dns-8b8d888b5-mf8ss\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.323581 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm9cc\" (UniqueName: \"kubernetes.io/projected/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-kube-api-access-cm9cc\") pod \"dnsmasq-dns-8b8d888b5-mf8ss\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.323603 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-dns-svc\") pod \"dnsmasq-dns-8b8d888b5-mf8ss\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.325762 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-dns-svc\") pod \"dnsmasq-dns-8b8d888b5-mf8ss\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.325951 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-config\") pod \"dnsmasq-dns-8b8d888b5-mf8ss\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.344724 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm9cc\" (UniqueName: \"kubernetes.io/projected/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-kube-api-access-cm9cc\") pod \"dnsmasq-dns-8b8d888b5-mf8ss\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.412933 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.657364 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-mf8ss"] Sep 30 06:34:18 crc kubenswrapper[4691]: I0930 06:34:18.752779 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77479b959-nvfnl"] Sep 30 06:34:18 crc kubenswrapper[4691]: W0930 06:34:18.762941 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod223c32e4_0e0d_4ec5_aed4_823e434a40d1.slice/crio-79f19fd58421b503a44b17d30aeb6c46c3c9054a9b4f66a991e409f85d5d19ec WatchSource:0}: Error finding container 79f19fd58421b503a44b17d30aeb6c46c3c9054a9b4f66a991e409f85d5d19ec: Status 404 returned error can't find the container with id 79f19fd58421b503a44b17d30aeb6c46c3c9054a9b4f66a991e409f85d5d19ec Sep 30 06:34:19 crc kubenswrapper[4691]: I0930 06:34:19.679049 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" event={"ID":"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0","Type":"ContainerStarted","Data":"98866d797f68156276376c01376c4ff38b4d40746b0c710cac07ebdf32cc6708"} Sep 30 06:34:19 crc kubenswrapper[4691]: I0930 06:34:19.683553 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77479b959-nvfnl" event={"ID":"223c32e4-0e0d-4ec5-aed4-823e434a40d1","Type":"ContainerStarted","Data":"79f19fd58421b503a44b17d30aeb6c46c3c9054a9b4f66a991e409f85d5d19ec"} Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.113937 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77479b959-nvfnl"] Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.136593 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-546bf79c69-swsst"] Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.137755 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.146525 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-546bf79c69-swsst"] Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.195447 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-config\") pod \"dnsmasq-dns-546bf79c69-swsst\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.195753 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-dns-svc\") pod \"dnsmasq-dns-546bf79c69-swsst\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.195775 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9s95\" (UniqueName: \"kubernetes.io/projected/18a8bb07-0424-46c0-8405-c49878049ffc-kube-api-access-g9s95\") pod \"dnsmasq-dns-546bf79c69-swsst\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.296870 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-config\") pod \"dnsmasq-dns-546bf79c69-swsst\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.296966 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-dns-svc\") pod \"dnsmasq-dns-546bf79c69-swsst\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.297004 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9s95\" (UniqueName: \"kubernetes.io/projected/18a8bb07-0424-46c0-8405-c49878049ffc-kube-api-access-g9s95\") pod \"dnsmasq-dns-546bf79c69-swsst\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.297748 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-dns-svc\") pod \"dnsmasq-dns-546bf79c69-swsst\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.297912 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-config\") pod \"dnsmasq-dns-546bf79c69-swsst\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.317080 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9s95\" (UniqueName: 
\"kubernetes.io/projected/18a8bb07-0424-46c0-8405-c49878049ffc-kube-api-access-g9s95\") pod \"dnsmasq-dns-546bf79c69-swsst\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.396574 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-mf8ss"] Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.420610 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-99796b587-zzw5c"] Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.421977 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.429681 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-99796b587-zzw5c"] Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.453050 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.501533 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfl6n\" (UniqueName: \"kubernetes.io/projected/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-kube-api-access-hfl6n\") pod \"dnsmasq-dns-99796b587-zzw5c\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.501587 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-dns-svc\") pod \"dnsmasq-dns-99796b587-zzw5c\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.501655 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-config\") pod \"dnsmasq-dns-99796b587-zzw5c\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.602069 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-config\") pod \"dnsmasq-dns-99796b587-zzw5c\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.602128 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfl6n\" (UniqueName: \"kubernetes.io/projected/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-kube-api-access-hfl6n\") pod \"dnsmasq-dns-99796b587-zzw5c\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.602154 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-dns-svc\") pod \"dnsmasq-dns-99796b587-zzw5c\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.602929 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-dns-svc\") pod \"dnsmasq-dns-99796b587-zzw5c\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.603259 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-config\") pod \"dnsmasq-dns-99796b587-zzw5c\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.634850 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfl6n\" (UniqueName: \"kubernetes.io/projected/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-kube-api-access-hfl6n\") pod \"dnsmasq-dns-99796b587-zzw5c\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.679602 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546bf79c69-swsst"] Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.705094 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6749c445df-fjmdf"] Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.706578 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.714674 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6749c445df-fjmdf"] Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.734801 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.806607 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-dns-svc\") pod \"dnsmasq-dns-6749c445df-fjmdf\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.806673 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hgl5\" (UniqueName: \"kubernetes.io/projected/871e32c8-9326-4b62-8a26-de0e8d3bc670-kube-api-access-4hgl5\") pod \"dnsmasq-dns-6749c445df-fjmdf\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.806741 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-config\") pod \"dnsmasq-dns-6749c445df-fjmdf\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.908160 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-config\") pod \"dnsmasq-dns-6749c445df-fjmdf\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.908222 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-dns-svc\") pod \"dnsmasq-dns-6749c445df-fjmdf\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.908263 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hgl5\" (UniqueName: \"kubernetes.io/projected/871e32c8-9326-4b62-8a26-de0e8d3bc670-kube-api-access-4hgl5\") pod \"dnsmasq-dns-6749c445df-fjmdf\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.909051 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-config\") pod \"dnsmasq-dns-6749c445df-fjmdf\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.909192 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-dns-svc\") pod \"dnsmasq-dns-6749c445df-fjmdf\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:22 crc kubenswrapper[4691]: I0930 06:34:22.926979 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hgl5\" (UniqueName: \"kubernetes.io/projected/871e32c8-9326-4b62-8a26-de0e8d3bc670-kube-api-access-4hgl5\") pod \"dnsmasq-dns-6749c445df-fjmdf\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " pod="openstack/dnsmasq-dns-6749c445df-fjmdf" 
Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.024755 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.299659 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.300768 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.305178 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.305208 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cvhfh" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.305451 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.305634 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.305715 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.305775 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.305937 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.316699 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.414556 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.414939 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.415031 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.415089 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.415122 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.415137 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.415151 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vssb\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-kube-api-access-8vssb\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.415202 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-config-data\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.415236 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.415257 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.415313 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516178 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516232 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516258 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: 
\"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516273 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516288 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vssb\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-kube-api-access-8vssb\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516315 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-config-data\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516335 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516352 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516379 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516415 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516436 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516635 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.516658 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.517218 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.518336 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-server-conf\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.518600 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-config-data\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.518635 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.520197 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.521960 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.523068 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-pod-info\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.523373 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.535911 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vssb\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-kube-api-access-8vssb\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.536333 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.559986 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.561123 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.562875 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lwgnz" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.563164 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.563250 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.564466 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.564640 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.564813 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.564929 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.586137 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.664741 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.719645 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fd5df9d9-7a0a-441c-b21d-92dff2af7376-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.719928 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.720055 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fd5df9d9-7a0a-441c-b21d-92dff2af7376-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.720158 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.720258 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.720492 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.720594 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.720688 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.720766 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.720836 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.720939 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrqcp\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-kube-api-access-mrqcp\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.824496 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrqcp\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-kube-api-access-mrqcp\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.824783 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fd5df9d9-7a0a-441c-b21d-92dff2af7376-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.824872 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.824990 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fd5df9d9-7a0a-441c-b21d-92dff2af7376-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.826439 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.826802 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.827000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc 
kubenswrapper[4691]: I0930 06:34:23.827109 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.827471 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.827600 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.827686 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.827411 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.826971 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.826755 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.826386 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.828110 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fd5df9d9-7a0a-441c-b21d-92dff2af7376-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.828286 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.828368 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.836632 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.837186 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.842449 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fd5df9d9-7a0a-441c-b21d-92dff2af7376-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.846035 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.846593 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrqcp\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-kube-api-access-mrqcp\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.847406 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.849432 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-default-user" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.851704 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.852018 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-server-conf" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.852197 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-notifications-svc" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.852487 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-plugins-conf" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.852668 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-notifications-config-data" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.852970 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-server-dockercfg-4xnn6" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.853157 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-notifications-erlang-cookie" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.857828 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.897233 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930284 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d454968e-74c7-45e3-9608-e915973c7f25-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930342 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d454968e-74c7-45e3-9608-e915973c7f25-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930376 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930396 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930449 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930466 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4qxw\" (UniqueName: \"kubernetes.io/projected/d454968e-74c7-45e3-9608-e915973c7f25-kube-api-access-d4qxw\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930481 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d454968e-74c7-45e3-9608-e915973c7f25-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930558 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d454968e-74c7-45e3-9608-e915973c7f25-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930597 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d454968e-74c7-45e3-9608-e915973c7f25-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930635 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:23 crc kubenswrapper[4691]: I0930 06:34:23.930652 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.031822 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.031862 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.031896 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.031913 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d454968e-74c7-45e3-9608-e915973c7f25-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.031931 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4qxw\" (UniqueName: \"kubernetes.io/projected/d454968e-74c7-45e3-9608-e915973c7f25-kube-api-access-d4qxw\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.031957 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d454968e-74c7-45e3-9608-e915973c7f25-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.031988 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d454968e-74c7-45e3-9608-e915973c7f25-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.032029 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.032048 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.032072 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d454968e-74c7-45e3-9608-e915973c7f25-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.032104 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d454968e-74c7-45e3-9608-e915973c7f25-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.032300 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-erlang-cookie\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.032600 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.033783 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d454968e-74c7-45e3-9608-e915973c7f25-plugins-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.034136 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d454968e-74c7-45e3-9608-e915973c7f25-server-conf\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.034828 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-plugins\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.035708 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d454968e-74c7-45e3-9608-e915973c7f25-pod-info\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.036442 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d454968e-74c7-45e3-9608-e915973c7f25-config-data\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.037499 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-confd\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.037829 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d454968e-74c7-45e3-9608-e915973c7f25-erlang-cookie-secret\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.038309 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d454968e-74c7-45e3-9608-e915973c7f25-rabbitmq-tls\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.048356 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4qxw\" (UniqueName: \"kubernetes.io/projected/d454968e-74c7-45e3-9608-e915973c7f25-kube-api-access-d4qxw\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.053448 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"rabbitmq-notifications-server-0\" (UID: \"d454968e-74c7-45e3-9608-e915973c7f25\") " pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:24 crc kubenswrapper[4691]: I0930 06:34:24.201554 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.608772 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.610531 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.613501 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.615313 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-2h6cd" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.615919 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.616323 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.631065 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.635961 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.639647 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.670682 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-secrets\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.670737 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-config-data-default\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.670775 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.670980 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.671105 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.671222 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " 
pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.671268 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-kolla-config\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.671293 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ccvs\" (UniqueName: \"kubernetes.io/projected/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-kube-api-access-2ccvs\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.671361 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.773633 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-secrets\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.773713 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-config-data-default\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.773763 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.773807 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.773845 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.773906 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.773933 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-kolla-config\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.773958 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ccvs\" (UniqueName: \"kubernetes.io/projected/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-kube-api-access-2ccvs\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.773980 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.774250 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.775216 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-config-data-generated\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.775307 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-config-data-default\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.775470 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-kolla-config\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.775471 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-operator-scripts\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.782315 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.794552 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 
crc kubenswrapper[4691]: I0930 06:34:26.794880 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-secrets\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.799285 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.813245 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ccvs\" (UniqueName: \"kubernetes.io/projected/08782d24-2bd9-48d6-b9b2-12a2ad66e6d0-kube-api-access-2ccvs\") pod \"openstack-galera-0\" (UID: \"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0\") " pod="openstack/openstack-galera-0" Sep 30 06:34:26 crc kubenswrapper[4691]: I0930 06:34:26.935161 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.096974 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.098665 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.100152 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.111553 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.111648 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-s6fnq" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.111732 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.111940 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.285730 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.286092 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.286213 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " 
pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.286325 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.286451 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wc65n\" (UniqueName: \"kubernetes.io/projected/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-kube-api-access-wc65n\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.286595 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.286715 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.286923 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.287084 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.387978 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.388225 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wc65n\" (UniqueName: \"kubernetes.io/projected/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-kube-api-access-wc65n\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.388347 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-config-data-generated\") pod 
\"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.388437 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.388523 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.388623 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.388763 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.388866 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.388980 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.388783 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.389259 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.389564 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc 
kubenswrapper[4691]: I0930 06:34:27.390234 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.390241 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.393668 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/secret/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-secrets\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.397824 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.398055 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.411849 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.412189 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wc65n\" (UniqueName: \"kubernetes.io/projected/605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6-kube-api-access-wc65n\") pod \"openstack-cell1-galera-0\" (UID: \"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6\") " pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.427449 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.450727 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.451769 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.455156 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.455407 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.455502 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-swmg4" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.461211 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.591708 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98xp5\" (UniqueName: \"kubernetes.io/projected/5c310640-e561-4e1e-8f7c-046a7eec139d-kube-api-access-98xp5\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.591752 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5c310640-e561-4e1e-8f7c-046a7eec139d-kolla-config\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.591777 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c310640-e561-4e1e-8f7c-046a7eec139d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.591848 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c310640-e561-4e1e-8f7c-046a7eec139d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.591869 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c310640-e561-4e1e-8f7c-046a7eec139d-config-data\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.692671 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c310640-e561-4e1e-8f7c-046a7eec139d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.692716 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c310640-e561-4e1e-8f7c-046a7eec139d-config-data\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.692829 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98xp5\" (UniqueName: 
\"kubernetes.io/projected/5c310640-e561-4e1e-8f7c-046a7eec139d-kube-api-access-98xp5\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.692848 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5c310640-e561-4e1e-8f7c-046a7eec139d-kolla-config\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.692867 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c310640-e561-4e1e-8f7c-046a7eec139d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.694620 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5c310640-e561-4e1e-8f7c-046a7eec139d-config-data\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.695118 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5c310640-e561-4e1e-8f7c-046a7eec139d-kolla-config\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.695725 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c310640-e561-4e1e-8f7c-046a7eec139d-memcached-tls-certs\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.696635 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c310640-e561-4e1e-8f7c-046a7eec139d-combined-ca-bundle\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.711464 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98xp5\" (UniqueName: \"kubernetes.io/projected/5c310640-e561-4e1e-8f7c-046a7eec139d-kube-api-access-98xp5\") pod \"memcached-0\" (UID: \"5c310640-e561-4e1e-8f7c-046a7eec139d\") " pod="openstack/memcached-0" Sep 30 06:34:27 crc kubenswrapper[4691]: I0930 06:34:27.774961 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Sep 30 06:34:29 crc kubenswrapper[4691]: I0930 06:34:29.043733 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 06:34:29 crc kubenswrapper[4691]: I0930 06:34:29.044956 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 06:34:29 crc kubenswrapper[4691]: I0930 06:34:29.050170 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-rkh6p" Sep 30 06:34:29 crc kubenswrapper[4691]: I0930 06:34:29.055860 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 06:34:29 crc kubenswrapper[4691]: I0930 06:34:29.218707 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd52s\" (UniqueName: \"kubernetes.io/projected/8ebf5adc-aea5-4d38-81e8-722c6f1db55c-kube-api-access-vd52s\") pod \"kube-state-metrics-0\" (UID: \"8ebf5adc-aea5-4d38-81e8-722c6f1db55c\") " pod="openstack/kube-state-metrics-0" Sep 30 06:34:29 crc kubenswrapper[4691]: I0930 06:34:29.320474 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd52s\" (UniqueName: \"kubernetes.io/projected/8ebf5adc-aea5-4d38-81e8-722c6f1db55c-kube-api-access-vd52s\") pod \"kube-state-metrics-0\" (UID: \"8ebf5adc-aea5-4d38-81e8-722c6f1db55c\") " pod="openstack/kube-state-metrics-0" Sep 30 06:34:29 crc kubenswrapper[4691]: I0930 06:34:29.347465 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd52s\" (UniqueName: \"kubernetes.io/projected/8ebf5adc-aea5-4d38-81e8-722c6f1db55c-kube-api-access-vd52s\") pod \"kube-state-metrics-0\" (UID: \"8ebf5adc-aea5-4d38-81e8-722c6f1db55c\") " pod="openstack/kube-state-metrics-0" Sep 30 06:34:29 crc kubenswrapper[4691]: I0930 06:34:29.361860 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.245357 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.248837 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.250682 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.252103 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.252222 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.253168 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-5hw8h" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.253796 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.255496 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.256227 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.437221 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d61470fc-16c1-40fb-bc8a-17517013b3be-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.437281 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.437365 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-config\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.437387 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d61470fc-16c1-40fb-bc8a-17517013b3be-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.437410 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.437505 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggnpl\" (UniqueName: \"kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-kube-api-access-ggnpl\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.437584 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.437793 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.541164 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.541328 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d61470fc-16c1-40fb-bc8a-17517013b3be-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.541473 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.541623 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-config\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.541700 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d61470fc-16c1-40fb-bc8a-17517013b3be-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.541750 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc 
kubenswrapper[4691]: I0930 06:34:30.541823 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggnpl\" (UniqueName: \"kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-kube-api-access-ggnpl\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.541921 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.542089 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d61470fc-16c1-40fb-bc8a-17517013b3be-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.547497 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.558295 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.559807 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.559820 4691 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.559934 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c288a5d41a5881b6adb6be722d4e7a99207424eb3b5d2db5e4a72cf60753eefa/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.567302 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-config\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.576359 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggnpl\" (UniqueName: \"kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-kube-api-access-ggnpl\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.578173 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d61470fc-16c1-40fb-bc8a-17517013b3be-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.600584 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:30 crc kubenswrapper[4691]: I0930 06:34:30.884963 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.914797 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2wmg8"] Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.916083 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.919030 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-srf6h" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.922877 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.930263 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-csq87"] Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.932493 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.935135 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.937665 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2wmg8"] Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.996754 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-var-run\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.996789 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/86397f09-76d1-4c35-a96a-5b6bde1e3574-var-run-ovn\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.996827 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86397f09-76d1-4c35-a96a-5b6bde1e3574-combined-ca-bundle\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.996849 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbmwv\" (UniqueName: \"kubernetes.io/projected/99bbc7fe-4a99-4f60-b840-8843790d6cb4-kube-api-access-zbmwv\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.996867 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/86397f09-76d1-4c35-a96a-5b6bde1e3574-var-log-ovn\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.996918 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-var-log\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.996939 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99bbc7fe-4a99-4f60-b840-8843790d6cb4-scripts\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.996957 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-var-lib\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:32 crc 
kubenswrapper[4691]: I0930 06:34:32.996983 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8r9r\" (UniqueName: \"kubernetes.io/projected/86397f09-76d1-4c35-a96a-5b6bde1e3574-kube-api-access-b8r9r\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.997004 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/86397f09-76d1-4c35-a96a-5b6bde1e3574-ovn-controller-tls-certs\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.997020 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/86397f09-76d1-4c35-a96a-5b6bde1e3574-var-run\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.997046 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86397f09-76d1-4c35-a96a-5b6bde1e3574-scripts\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:32 crc kubenswrapper[4691]: I0930 06:34:32.997065 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-etc-ovs\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.000494 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-csq87"] Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.097969 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-var-run\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098008 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/86397f09-76d1-4c35-a96a-5b6bde1e3574-var-run-ovn\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098054 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86397f09-76d1-4c35-a96a-5b6bde1e3574-combined-ca-bundle\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098096 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbmwv\" (UniqueName: \"kubernetes.io/projected/99bbc7fe-4a99-4f60-b840-8843790d6cb4-kube-api-access-zbmwv\") pod \"ovn-controller-ovs-csq87\" (UID: 
\"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098114 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/86397f09-76d1-4c35-a96a-5b6bde1e3574-var-log-ovn\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098413 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-var-run\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098708 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-var-log\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098758 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99bbc7fe-4a99-4f60-b840-8843790d6cb4-scripts\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098787 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-var-lib\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098820 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8r9r\" (UniqueName: \"kubernetes.io/projected/86397f09-76d1-4c35-a96a-5b6bde1e3574-kube-api-access-b8r9r\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098847 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/86397f09-76d1-4c35-a96a-5b6bde1e3574-ovn-controller-tls-certs\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098865 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/86397f09-76d1-4c35-a96a-5b6bde1e3574-var-run\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098918 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/86397f09-76d1-4c35-a96a-5b6bde1e3574-var-log-ovn\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098936 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-var-log\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098943 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86397f09-76d1-4c35-a96a-5b6bde1e3574-scripts\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.098979 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-etc-ovs\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.099118 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-etc-ovs\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.099304 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/99bbc7fe-4a99-4f60-b840-8843790d6cb4-var-lib\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.099310 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/86397f09-76d1-4c35-a96a-5b6bde1e3574-var-run-ovn\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.099418 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/86397f09-76d1-4c35-a96a-5b6bde1e3574-var-run\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.100805 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/99bbc7fe-4a99-4f60-b840-8843790d6cb4-scripts\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.101339 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/86397f09-76d1-4c35-a96a-5b6bde1e3574-scripts\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.102591 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/86397f09-76d1-4c35-a96a-5b6bde1e3574-ovn-controller-tls-certs\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.102846 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86397f09-76d1-4c35-a96a-5b6bde1e3574-combined-ca-bundle\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.127376 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbmwv\" (UniqueName: \"kubernetes.io/projected/99bbc7fe-4a99-4f60-b840-8843790d6cb4-kube-api-access-zbmwv\") pod \"ovn-controller-ovs-csq87\" (UID: \"99bbc7fe-4a99-4f60-b840-8843790d6cb4\") " pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.127379 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8r9r\" (UniqueName: \"kubernetes.io/projected/86397f09-76d1-4c35-a96a-5b6bde1e3574-kube-api-access-b8r9r\") pod \"ovn-controller-2wmg8\" (UID: \"86397f09-76d1-4c35-a96a-5b6bde1e3574\") " pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.294225 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-2wmg8" Sep 30 06:34:33 crc kubenswrapper[4691]: I0930 06:34:33.304724 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.274046 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.275642 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.278669 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-sqcfj" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.278845 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.278918 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.283450 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.291780 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.294940 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.421823 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48c486cf-48da-4fd0-b450-d821ab6b2755-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.421934 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22bz\" (UniqueName: \"kubernetes.io/projected/48c486cf-48da-4fd0-b450-d821ab6b2755-kube-api-access-t22bz\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 
06:34:34.421990 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c486cf-48da-4fd0-b450-d821ab6b2755-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.422050 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48c486cf-48da-4fd0-b450-d821ab6b2755-config\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.422068 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c486cf-48da-4fd0-b450-d821ab6b2755-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.422098 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48c486cf-48da-4fd0-b450-d821ab6b2755-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.422150 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c486cf-48da-4fd0-b450-d821ab6b2755-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.422218 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.523202 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t22bz\" (UniqueName: \"kubernetes.io/projected/48c486cf-48da-4fd0-b450-d821ab6b2755-kube-api-access-t22bz\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.523262 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c486cf-48da-4fd0-b450-d821ab6b2755-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.523289 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48c486cf-48da-4fd0-b450-d821ab6b2755-config\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.523308 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/48c486cf-48da-4fd0-b450-d821ab6b2755-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.523326 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48c486cf-48da-4fd0-b450-d821ab6b2755-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.523347 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c486cf-48da-4fd0-b450-d821ab6b2755-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.523404 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.523424 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48c486cf-48da-4fd0-b450-d821ab6b2755-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.524124 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/48c486cf-48da-4fd0-b450-d821ab6b2755-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.524597 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/48c486cf-48da-4fd0-b450-d821ab6b2755-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.525414 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/48c486cf-48da-4fd0-b450-d821ab6b2755-config\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.525548 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.528667 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48c486cf-48da-4fd0-b450-d821ab6b2755-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.529945 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c486cf-48da-4fd0-b450-d821ab6b2755-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.530530 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/48c486cf-48da-4fd0-b450-d821ab6b2755-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.542068 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22bz\" (UniqueName: \"kubernetes.io/projected/48c486cf-48da-4fd0-b450-d821ab6b2755-kube-api-access-t22bz\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.550376 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ovsdbserver-sb-0\" (UID: \"48c486cf-48da-4fd0-b450-d821ab6b2755\") " pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:34 crc kubenswrapper[4691]: I0930 06:34:34.613235 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:35 crc kubenswrapper[4691]: E0930 06:34:35.311201 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Sep 30 06:34:35 crc kubenswrapper[4691]: E0930 06:34:35.311276 4691 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Sep 30 06:34:35 crc kubenswrapper[4691]: E0930 06:34:35.311425 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cm9cc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-8b8d888b5-mf8ss_openstack(23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 06:34:35 crc kubenswrapper[4691]: E0930 06:34:35.312699 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" podUID="23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0" Sep 30 06:34:35 crc kubenswrapper[4691]: E0930 06:34:35.394294 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Sep 30 06:34:35 crc kubenswrapper[4691]: E0930 06:34:35.394349 4691 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest" Sep 30 06:34:35 crc kubenswrapper[4691]: E0930 06:34:35.394459 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:38.102.83.30:5001/podified-master-centos10/openstack-neutron-server:watcher_latest,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rqh66,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-77479b959-nvfnl_openstack(223c32e4-0e0d-4ec5-aed4-823e434a40d1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 06:34:35 crc kubenswrapper[4691]: E0930 06:34:35.395610 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-77479b959-nvfnl" podUID="223c32e4-0e0d-4ec5-aed4-823e434a40d1" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.102098 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.104179 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.107807 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.108213 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.108689 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.109076 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-jqk2x" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.122365 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.251778 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46482328-297b-40b1-83e1-2270733d27d7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.251969 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46482328-297b-40b1-83e1-2270733d27d7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.252014 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjmwl\" (UniqueName: \"kubernetes.io/projected/46482328-297b-40b1-83e1-2270733d27d7-kube-api-access-cjmwl\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.252050 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46482328-297b-40b1-83e1-2270733d27d7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.252114 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/46482328-297b-40b1-83e1-2270733d27d7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.252162 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/46482328-297b-40b1-83e1-2270733d27d7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.252242 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: 
\"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.252368 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46482328-297b-40b1-83e1-2270733d27d7-config\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.354373 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.354437 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46482328-297b-40b1-83e1-2270733d27d7-config\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.354459 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46482328-297b-40b1-83e1-2270733d27d7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.354520 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46482328-297b-40b1-83e1-2270733d27d7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.354539 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjmwl\" (UniqueName: \"kubernetes.io/projected/46482328-297b-40b1-83e1-2270733d27d7-kube-api-access-cjmwl\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.354557 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46482328-297b-40b1-83e1-2270733d27d7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.354589 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/46482328-297b-40b1-83e1-2270733d27d7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.354613 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/46482328-297b-40b1-83e1-2270733d27d7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.357916 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.358123 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/46482328-297b-40b1-83e1-2270733d27d7-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.358641 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46482328-297b-40b1-83e1-2270733d27d7-config\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.358717 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/46482328-297b-40b1-83e1-2270733d27d7-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.362332 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/46482328-297b-40b1-83e1-2270733d27d7-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.363260 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46482328-297b-40b1-83e1-2270733d27d7-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.372590 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/46482328-297b-40b1-83e1-2270733d27d7-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.385844 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjmwl\" (UniqueName: \"kubernetes.io/projected/46482328-297b-40b1-83e1-2270733d27d7-kube-api-access-cjmwl\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.435080 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"ovsdbserver-nb-0\" (UID: \"46482328-297b-40b1-83e1-2270733d27d7\") " pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.661305 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-notifications-server-0"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.673760 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.682445 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546bf79c69-swsst"] Sep 30 06:34:36 crc 
kubenswrapper[4691]: I0930 06:34:36.693560 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6749c445df-fjmdf"] Sep 30 06:34:36 crc kubenswrapper[4691]: W0930 06:34:36.708163 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod871e32c8_9326_4b62_8a26_de0e8d3bc670.slice/crio-41f05a397f9a6bcb7302593bacb27e7971598339b22d60e49628a979d2fa85a7 WatchSource:0}: Error finding container 41f05a397f9a6bcb7302593bacb27e7971598339b22d60e49628a979d2fa85a7: Status 404 returned error can't find the container with id 41f05a397f9a6bcb7302593bacb27e7971598339b22d60e49628a979d2fa85a7 Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.719836 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.729548 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.732244 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.732672 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77479b959-nvfnl" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.735952 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2wmg8"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.741271 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Sep 30 06:34:36 crc kubenswrapper[4691]: W0930 06:34:36.747722 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86397f09_76d1_4c35_a96a_5b6bde1e3574.slice/crio-8a02cd8eebd61ffb54d3df1a9aa306118cba711557654c4be0a94c438ed7e9b8 WatchSource:0}: Error finding container 8a02cd8eebd61ffb54d3df1a9aa306118cba711557654c4be0a94c438ed7e9b8: Status 404 returned error can't find the container with id 8a02cd8eebd61ffb54d3df1a9aa306118cba711557654c4be0a94c438ed7e9b8 Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.768204 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.787967 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.794182 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-99796b587-zzw5c"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.822634 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.825352 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77479b959-nvfnl" event={"ID":"223c32e4-0e0d-4ec5-aed4-823e434a40d1","Type":"ContainerDied","Data":"79f19fd58421b503a44b17d30aeb6c46c3c9054a9b4f66a991e409f85d5d19ec"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.825485 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77479b959-nvfnl" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.832124 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5c310640-e561-4e1e-8f7c-046a7eec139d","Type":"ContainerStarted","Data":"9461fd09ea6e6fc1b22dce4d45e4bda14279f2d98e13b0a5c3f1500f23ea4813"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.837743 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b6ff7c5-6146-432e-a89c-fe95ac728e5c","Type":"ContainerStarted","Data":"20832adcf2ec865fc3b6cdc4c73cacb061fc89ca927a4726e31b491eef44ecf2"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.866727 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-config\") pod \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.866896 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cm9cc\" (UniqueName: \"kubernetes.io/projected/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-kube-api-access-cm9cc\") pod \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.866952 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rqh66\" (UniqueName: \"kubernetes.io/projected/223c32e4-0e0d-4ec5-aed4-823e434a40d1-kube-api-access-rqh66\") pod \"223c32e4-0e0d-4ec5-aed4-823e434a40d1\" (UID: \"223c32e4-0e0d-4ec5-aed4-823e434a40d1\") " Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.867011 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/223c32e4-0e0d-4ec5-aed4-823e434a40d1-config\") pod \"223c32e4-0e0d-4ec5-aed4-823e434a40d1\" (UID: \"223c32e4-0e0d-4ec5-aed4-823e434a40d1\") " Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.867029 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-dns-svc\") pod \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\" (UID: \"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0\") " Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.867095 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0","Type":"ContainerStarted","Data":"d465c41dcf4e80ac9ec3e1c12ddee42d88d32ac021e8560340a91aa434358414"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.867869 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-config" (OuterVolumeSpecName: "config") pod "23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0" (UID: "23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.868375 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/223c32e4-0e0d-4ec5-aed4-823e434a40d1-config" (OuterVolumeSpecName: "config") pod "223c32e4-0e0d-4ec5-aed4-823e434a40d1" (UID: "223c32e4-0e0d-4ec5-aed4-823e434a40d1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.868610 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0" (UID: "23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.870094 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d454968e-74c7-45e3-9608-e915973c7f25","Type":"ContainerStarted","Data":"59253b0a308d16f5de49e12c6c5835e4f0e986f8b250cf1b8ae08d8505301d8a"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.880051 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/223c32e4-0e0d-4ec5-aed4-823e434a40d1-kube-api-access-rqh66" (OuterVolumeSpecName: "kube-api-access-rqh66") pod "223c32e4-0e0d-4ec5-aed4-823e434a40d1" (UID: "223c32e4-0e0d-4ec5-aed4-823e434a40d1"). InnerVolumeSpecName "kube-api-access-rqh66". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.880164 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-kube-api-access-cm9cc" (OuterVolumeSpecName: "kube-api-access-cm9cc") pod "23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0" (UID: "23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0"). InnerVolumeSpecName "kube-api-access-cm9cc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.883180 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99796b587-zzw5c" event={"ID":"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9","Type":"ContainerStarted","Data":"f9fcd50e982b059f8c7235aa14c5aa54b89887de80c368780c5e8672bcb89bf2"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.886210 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2wmg8" event={"ID":"86397f09-76d1-4c35-a96a-5b6bde1e3574","Type":"ContainerStarted","Data":"8a02cd8eebd61ffb54d3df1a9aa306118cba711557654c4be0a94c438ed7e9b8"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.908909 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-csq87"] Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.910346 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.913522 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b8d888b5-mf8ss" event={"ID":"23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0","Type":"ContainerDied","Data":"98866d797f68156276376c01376c4ff38b4d40746b0c710cac07ebdf32cc6708"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.927633 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d61470fc-16c1-40fb-bc8a-17517013b3be","Type":"ContainerStarted","Data":"5cecc09118a6085ae692055c9030c58e0231d43d1b32afd1ea79888a196c7da5"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.935765 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546bf79c69-swsst" event={"ID":"18a8bb07-0424-46c0-8405-c49878049ffc","Type":"ContainerStarted","Data":"d7d500f30ff76f5cdd50b9220761d39900d4ccd5bb4d28656f48a1215f0abda3"} Sep 30 06:34:36 crc kubenswrapper[4691]: W0930 06:34:36.936539 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99bbc7fe_4a99_4f60_b840_8843790d6cb4.slice/crio-44186d28eb2997330eaeeb80c951a998458c36ebb6d1bd74b15b76c9c1d05cbb WatchSource:0}: Error finding container 44186d28eb2997330eaeeb80c951a998458c36ebb6d1bd74b15b76c9c1d05cbb: Status 404 returned error can't find the container with id 44186d28eb2997330eaeeb80c951a998458c36ebb6d1bd74b15b76c9c1d05cbb Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.957553 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fd5df9d9-7a0a-441c-b21d-92dff2af7376","Type":"ContainerStarted","Data":"cb51188eddc163015e354c12b278405a46bb52ee286d1a252cb509b6f9199636"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.972867 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/223c32e4-0e0d-4ec5-aed4-823e434a40d1-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.972923 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.972933 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.972942 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cm9cc\" (UniqueName: \"kubernetes.io/projected/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0-kube-api-access-cm9cc\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.972951 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rqh66\" (UniqueName: \"kubernetes.io/projected/223c32e4-0e0d-4ec5-aed4-823e434a40d1-kube-api-access-rqh66\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.975648 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6","Type":"ContainerStarted","Data":"1d062b38820751b0f500993e7f7c1fe4e551618cad9fbb0a277ecdab21c66f0e"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 
06:34:36.979904 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" event={"ID":"871e32c8-9326-4b62-8a26-de0e8d3bc670","Type":"ContainerStarted","Data":"41f05a397f9a6bcb7302593bacb27e7971598339b22d60e49628a979d2fa85a7"} Sep 30 06:34:36 crc kubenswrapper[4691]: I0930 06:34:36.980833 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-mf8ss"] Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.012240 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b8d888b5-mf8ss"] Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.046964 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.121952 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.239271 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0" path="/var/lib/kubelet/pods/23f40da9-1a7d-4b60-8c65-2f9e0cbc6cd0/volumes" Sep 30 06:34:37 crc kubenswrapper[4691]: W0930 06:34:37.287739 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod48c486cf_48da_4fd0_b450_d821ab6b2755.slice/crio-4c1d7199e9d25c1cd6758287553a2efa94e4991750a6778be4061549f21e37f0 WatchSource:0}: Error finding container 4c1d7199e9d25c1cd6758287553a2efa94e4991750a6778be4061549f21e37f0: Status 404 returned error can't find the container with id 4c1d7199e9d25c1cd6758287553a2efa94e4991750a6778be4061549f21e37f0 Sep 30 06:34:37 crc kubenswrapper[4691]: E0930 06:34:37.401205 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:ovsdbserver-nb,Image:38.102.83.30:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest,Command:[/usr/bin/dumb-init],Args:[/usr/local/bin/container-scripts/setup.sh],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n75hb9h88h68fh595h66fh59h597h69hf8h58bh666h695hc4h687h57ch679h67bh57h567hdbh88h697h675h59dh5ch66dh56bh655h574h5d4h78q,ValueFrom:nil,},EnvVar{Name:OVN_LOGDIR,Value:/tmp,ValueFrom:nil,},EnvVar{Name:OVN_RUNDIR,Value:/tmp,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovndbcluster-nb-etc-ovn,ReadOnly:false,MountPath:/etc/ovn,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndb.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovndb.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovsdbserver-nb-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjmwl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:&Lifecycle{PostStart:nil,PreStop:&LifecycleHandler{Exec:&ExecAction{Command:[/usr/local/bin/container-scripts/cleanup.sh],},HTTPGet:nil,TCPSocket:nil,Sleep:nil,},},TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pidof ovsdb-server],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:20,TerminationGracePeriodSeconds:nil,},ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
ovsdbserver-nb-0_openstack(46482328-297b-40b1-83e1-2270733d27d7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 06:34:37 crc kubenswrapper[4691]: E0930 06:34:37.404143 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openstack-network-exporter,Image:quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified,Command:[/app/openstack-network-exporter],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPENSTACK_NETWORK_EXPORTER_YAML,Value:/etc/config/openstack-network-exporter.yaml,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovsdb-rundir,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovnmetrics.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/private/ovnmetrics.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-certs-tls-certs,ReadOnly:true,MountPath:/etc/pki/tls/certs/ovndbca.crt,SubPath:ca.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cjmwl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovsdbserver-nb-0_openstack(46482328-297b-40b1-83e1-2270733d27d7): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.404507 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Sep 30 06:34:37 crc kubenswrapper[4691]: E0930 06:34:37.405846 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-nb\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack/ovsdbserver-nb-0" podUID="46482328-297b-40b1-83e1-2270733d27d7" Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.536663 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77479b959-nvfnl"] Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.548083 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77479b959-nvfnl"] Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.991794 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"48c486cf-48da-4fd0-b450-d821ab6b2755","Type":"ContainerStarted","Data":"4c1d7199e9d25c1cd6758287553a2efa94e4991750a6778be4061549f21e37f0"} Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.994693 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-csq87" event={"ID":"99bbc7fe-4a99-4f60-b840-8843790d6cb4","Type":"ContainerStarted","Data":"44186d28eb2997330eaeeb80c951a998458c36ebb6d1bd74b15b76c9c1d05cbb"} Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.997769 4691 generic.go:334] "Generic (PLEG): container finished" podID="871e32c8-9326-4b62-8a26-de0e8d3bc670" containerID="46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a" exitCode=0 Sep 30 06:34:37 crc kubenswrapper[4691]: I0930 06:34:37.997991 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" event={"ID":"871e32c8-9326-4b62-8a26-de0e8d3bc670","Type":"ContainerDied","Data":"46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a"} Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.000490 4691 generic.go:334] "Generic (PLEG): container finished" podID="18a8bb07-0424-46c0-8405-c49878049ffc" containerID="485376ad41b18d5526a065fc309c8adbc7cb6fb6d3d6cd30cd09739899ce6cee" exitCode=0 Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.000531 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546bf79c69-swsst" event={"ID":"18a8bb07-0424-46c0-8405-c49878049ffc","Type":"ContainerDied","Data":"485376ad41b18d5526a065fc309c8adbc7cb6fb6d3d6cd30cd09739899ce6cee"} Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.003417 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8ebf5adc-aea5-4d38-81e8-722c6f1db55c","Type":"ContainerStarted","Data":"debd78bd3131a8ae316d65287dd37311473cfbdf99704f67c29877839f077bd3"} Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.005213 4691 generic.go:334] "Generic (PLEG): container finished" podID="0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" containerID="2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935" exitCode=0 Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.005259 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99796b587-zzw5c" event={"ID":"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9","Type":"ContainerDied","Data":"2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935"} Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.006254 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"46482328-297b-40b1-83e1-2270733d27d7","Type":"ContainerStarted","Data":"5cec797aaf22df79a298739078e9fbabeb30c39305be07558d928c548b15c5d1"} Sep 30 06:34:38 crc kubenswrapper[4691]: E0930 06:34:38.009148 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest\\\"\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"]" pod="openstack/ovsdbserver-nb-0" podUID="46482328-297b-40b1-83e1-2270733d27d7" Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.753532 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.910244 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-dns-svc\") pod \"18a8bb07-0424-46c0-8405-c49878049ffc\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.910323 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-config\") pod \"18a8bb07-0424-46c0-8405-c49878049ffc\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.910429 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9s95\" (UniqueName: \"kubernetes.io/projected/18a8bb07-0424-46c0-8405-c49878049ffc-kube-api-access-g9s95\") pod \"18a8bb07-0424-46c0-8405-c49878049ffc\" (UID: \"18a8bb07-0424-46c0-8405-c49878049ffc\") " Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.923443 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18a8bb07-0424-46c0-8405-c49878049ffc-kube-api-access-g9s95" (OuterVolumeSpecName: "kube-api-access-g9s95") pod "18a8bb07-0424-46c0-8405-c49878049ffc" (UID: "18a8bb07-0424-46c0-8405-c49878049ffc"). InnerVolumeSpecName "kube-api-access-g9s95". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.933114 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "18a8bb07-0424-46c0-8405-c49878049ffc" (UID: "18a8bb07-0424-46c0-8405-c49878049ffc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:34:38 crc kubenswrapper[4691]: I0930 06:34:38.941516 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-config" (OuterVolumeSpecName: "config") pod "18a8bb07-0424-46c0-8405-c49878049ffc" (UID: "18a8bb07-0424-46c0-8405-c49878049ffc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:34:39 crc kubenswrapper[4691]: I0930 06:34:39.015078 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9s95\" (UniqueName: \"kubernetes.io/projected/18a8bb07-0424-46c0-8405-c49878049ffc-kube-api-access-g9s95\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:39 crc kubenswrapper[4691]: I0930 06:34:39.015107 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:39 crc kubenswrapper[4691]: I0930 06:34:39.015116 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/18a8bb07-0424-46c0-8405-c49878049ffc-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:39 crc kubenswrapper[4691]: I0930 06:34:39.015461 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-546bf79c69-swsst" Sep 30 06:34:39 crc kubenswrapper[4691]: I0930 06:34:39.015804 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-546bf79c69-swsst" event={"ID":"18a8bb07-0424-46c0-8405-c49878049ffc","Type":"ContainerDied","Data":"d7d500f30ff76f5cdd50b9220761d39900d4ccd5bb4d28656f48a1215f0abda3"} Sep 30 06:34:39 crc kubenswrapper[4691]: I0930 06:34:39.015838 4691 scope.go:117] "RemoveContainer" containerID="485376ad41b18d5526a065fc309c8adbc7cb6fb6d3d6cd30cd09739899ce6cee" Sep 30 06:34:39 crc kubenswrapper[4691]: E0930 06:34:39.017163 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"ovsdbserver-nb\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/podified-master-centos10/openstack-ovn-nb-db-server:watcher_latest\\\"\", failed to \"StartContainer\" for \"openstack-network-exporter\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified\\\"\"]" pod="openstack/ovsdbserver-nb-0" podUID="46482328-297b-40b1-83e1-2270733d27d7" Sep 30 06:34:39 crc kubenswrapper[4691]: I0930 06:34:39.066788 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-546bf79c69-swsst"] Sep 30 06:34:39 crc kubenswrapper[4691]: I0930 06:34:39.072594 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-546bf79c69-swsst"] Sep 30 06:34:39 crc kubenswrapper[4691]: I0930 06:34:39.239704 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18a8bb07-0424-46c0-8405-c49878049ffc" path="/var/lib/kubelet/pods/18a8bb07-0424-46c0-8405-c49878049ffc/volumes" Sep 30 06:34:39 crc kubenswrapper[4691]: I0930 06:34:39.240596 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="223c32e4-0e0d-4ec5-aed4-823e434a40d1" path="/var/lib/kubelet/pods/223c32e4-0e0d-4ec5-aed4-823e434a40d1/volumes" Sep 30 06:34:45 crc kubenswrapper[4691]: I0930 06:34:45.069911 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" event={"ID":"871e32c8-9326-4b62-8a26-de0e8d3bc670","Type":"ContainerStarted","Data":"d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53"} Sep 30 06:34:45 crc kubenswrapper[4691]: I0930 06:34:45.070414 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:45 crc kubenswrapper[4691]: I0930 06:34:45.110367 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" podStartSLOduration=23.029715424 podStartE2EDuration="23.110336242s" podCreationTimestamp="2025-09-30 06:34:22 +0000 UTC" firstStartedPulling="2025-09-30 06:34:36.743257928 +0000 UTC m=+920.218278968" lastFinishedPulling="2025-09-30 06:34:36.823878746 +0000 UTC m=+920.298899786" observedRunningTime="2025-09-30 06:34:45.096226391 +0000 UTC m=+928.571247471" watchObservedRunningTime="2025-09-30 06:34:45.110336242 +0000 UTC m=+928.585357312" Sep 30 06:34:46 crc kubenswrapper[4691]: I0930 06:34:46.081604 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"5c310640-e561-4e1e-8f7c-046a7eec139d","Type":"ContainerStarted","Data":"a49ae6e8a714aed6e21b8f2638818701a191b70a2bd87e9c225ec45b00669165"} Sep 30 06:34:46 crc kubenswrapper[4691]: I0930 06:34:46.083146 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/memcached-0" Sep 30 06:34:46 crc kubenswrapper[4691]: I0930 06:34:46.090369 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b6ff7c5-6146-432e-a89c-fe95ac728e5c","Type":"ContainerStarted","Data":"b57a0525ddc4af294fd81dcc684a4dec26e9b838458599245b9086835932afd7"} Sep 30 06:34:46 crc kubenswrapper[4691]: I0930 06:34:46.114825 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d454968e-74c7-45e3-9608-e915973c7f25","Type":"ContainerStarted","Data":"dedcda10d9f4175b32470999d84df603f681ecdb704564dc1ee4851215f61575"} Sep 30 06:34:46 crc kubenswrapper[4691]: I0930 06:34:46.149837 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99796b587-zzw5c" event={"ID":"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9","Type":"ContainerStarted","Data":"947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4"} Sep 30 06:34:46 crc kubenswrapper[4691]: I0930 06:34:46.149877 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:46 crc kubenswrapper[4691]: I0930 06:34:46.150597 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=12.355895683 podStartE2EDuration="19.150577274s" podCreationTimestamp="2025-09-30 06:34:27 +0000 UTC" firstStartedPulling="2025-09-30 06:34:36.664499979 +0000 UTC m=+920.139521029" lastFinishedPulling="2025-09-30 06:34:43.45918158 +0000 UTC m=+926.934202620" observedRunningTime="2025-09-30 06:34:46.124622195 +0000 UTC m=+929.599643245" watchObservedRunningTime="2025-09-30 06:34:46.150577274 +0000 UTC m=+929.625598314" Sep 30 06:34:46 crc kubenswrapper[4691]: I0930 06:34:46.233021 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-99796b587-zzw5c" podStartSLOduration=24.233005551 podStartE2EDuration="24.233005551s" podCreationTimestamp="2025-09-30 06:34:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:34:46.231351689 +0000 UTC m=+929.706372749" watchObservedRunningTime="2025-09-30 06:34:46.233005551 +0000 UTC m=+929.708026591" Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.160481 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0","Type":"ContainerStarted","Data":"9fc663a7aa533d21d4ce05e408a9f76d42deaa758dc33218b48a88f7b1944308"} Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.163481 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fd5df9d9-7a0a-441c-b21d-92dff2af7376","Type":"ContainerStarted","Data":"23bbfb3a62c8abdf4f6d70e59cf97b8f36b8de2806338d340cd77a5da638b089"} Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.165707 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6","Type":"ContainerStarted","Data":"745b8c886a9994527800a2a4783110a9ce6274df3adb380b23b030ed803eb0ad"} Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.168152 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2wmg8" event={"ID":"86397f09-76d1-4c35-a96a-5b6bde1e3574","Type":"ContainerStarted","Data":"b906c5ad646eed94603a17e2cc7a923ef528538a0992248cf1d4b2091d62ba5e"} Sep 30 
Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.168321 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-2wmg8"
Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.170166 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"48c486cf-48da-4fd0-b450-d821ab6b2755","Type":"ContainerStarted","Data":"b42339dedf6f8bd92d1a308e03c4fba69cd1ac303f5c8a65c58d570b410b186a"}
Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.172471 4691 generic.go:334] "Generic (PLEG): container finished" podID="99bbc7fe-4a99-4f60-b840-8843790d6cb4" containerID="39fe100720bd1fbbe4d3e6937900ad3a405914ea4524759eb48678def31beacc" exitCode=0
Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.172605 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-csq87" event={"ID":"99bbc7fe-4a99-4f60-b840-8843790d6cb4","Type":"ContainerDied","Data":"39fe100720bd1fbbe4d3e6937900ad3a405914ea4524759eb48678def31beacc"}
Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.175690 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8ebf5adc-aea5-4d38-81e8-722c6f1db55c","Type":"ContainerStarted","Data":"0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0"}
Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.263380 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=10.082262352 podStartE2EDuration="18.263357568s" podCreationTimestamp="2025-09-30 06:34:29 +0000 UTC" firstStartedPulling="2025-09-30 06:34:37.11695091 +0000 UTC m=+920.591971950" lastFinishedPulling="2025-09-30 06:34:45.298046106 +0000 UTC m=+928.773067166" observedRunningTime="2025-09-30 06:34:47.254561236 +0000 UTC m=+930.729582296" watchObservedRunningTime="2025-09-30 06:34:47.263357568 +0000 UTC m=+930.738378618"
Sep 30 06:34:47 crc kubenswrapper[4691]: I0930 06:34:47.297459 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-2wmg8" podStartSLOduration=7.984649915 podStartE2EDuration="15.297442618s" podCreationTimestamp="2025-09-30 06:34:32 +0000 UTC" firstStartedPulling="2025-09-30 06:34:36.777169293 +0000 UTC m=+920.252190333" lastFinishedPulling="2025-09-30 06:34:44.089961986 +0000 UTC m=+927.564983036" observedRunningTime="2025-09-30 06:34:47.293795631 +0000 UTC m=+930.768816681" watchObservedRunningTime="2025-09-30 06:34:47.297442618 +0000 UTC m=+930.772463658"
Sep 30 06:34:48 crc kubenswrapper[4691]: I0930 06:34:48.196574 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-csq87" event={"ID":"99bbc7fe-4a99-4f60-b840-8843790d6cb4","Type":"ContainerStarted","Data":"2c0fa9035e9fdb82b3101fa7d8a35b0a336ef6f69e68dc22b19706b63c6a82b3"}
Sep 30 06:34:48 crc kubenswrapper[4691]: I0930 06:34:48.196867 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-csq87" event={"ID":"99bbc7fe-4a99-4f60-b840-8843790d6cb4","Type":"ContainerStarted","Data":"3c228656673f8241440cde49f2660cd8e936fdfd10c4f9ed21bb7d07a1a1df82"}
Sep 30 06:34:48 crc kubenswrapper[4691]: I0930 06:34:48.196905 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-csq87"
Sep 30 06:34:48 crc kubenswrapper[4691]: I0930 06:34:48.196918 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-csq87"
Sep 30 06:34:48 crc
kubenswrapper[4691]: I0930 06:34:48.199235 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d61470fc-16c1-40fb-bc8a-17517013b3be","Type":"ContainerStarted","Data":"abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793"} Sep 30 06:34:48 crc kubenswrapper[4691]: I0930 06:34:48.199450 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 06:34:48 crc kubenswrapper[4691]: I0930 06:34:48.221271 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-csq87" podStartSLOduration=9.640481929 podStartE2EDuration="16.221249637s" podCreationTimestamp="2025-09-30 06:34:32 +0000 UTC" firstStartedPulling="2025-09-30 06:34:36.947720308 +0000 UTC m=+920.422741338" lastFinishedPulling="2025-09-30 06:34:43.528487956 +0000 UTC m=+927.003509046" observedRunningTime="2025-09-30 06:34:48.217644381 +0000 UTC m=+931.692665451" watchObservedRunningTime="2025-09-30 06:34:48.221249637 +0000 UTC m=+931.696270677" Sep 30 06:34:51 crc kubenswrapper[4691]: I0930 06:34:51.236034 4691 generic.go:334] "Generic (PLEG): container finished" podID="08782d24-2bd9-48d6-b9b2-12a2ad66e6d0" containerID="9fc663a7aa533d21d4ce05e408a9f76d42deaa758dc33218b48a88f7b1944308" exitCode=0 Sep 30 06:34:51 crc kubenswrapper[4691]: I0930 06:34:51.239220 4691 generic.go:334] "Generic (PLEG): container finished" podID="605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6" containerID="745b8c886a9994527800a2a4783110a9ce6274df3adb380b23b030ed803eb0ad" exitCode=0 Sep 30 06:34:51 crc kubenswrapper[4691]: I0930 06:34:51.245399 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0","Type":"ContainerDied","Data":"9fc663a7aa533d21d4ce05e408a9f76d42deaa758dc33218b48a88f7b1944308"} Sep 30 06:34:51 crc kubenswrapper[4691]: I0930 06:34:51.245496 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6","Type":"ContainerDied","Data":"745b8c886a9994527800a2a4783110a9ce6274df3adb380b23b030ed803eb0ad"} Sep 30 06:34:51 crc kubenswrapper[4691]: I0930 06:34:51.245525 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"48c486cf-48da-4fd0-b450-d821ab6b2755","Type":"ContainerStarted","Data":"7055036246b792745b51f5dd34ad5cc2598da12dbdf16451cca3bbafc808a276"} Sep 30 06:34:51 crc kubenswrapper[4691]: I0930 06:34:51.280799 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=5.072847215 podStartE2EDuration="18.280776647s" podCreationTimestamp="2025-09-30 06:34:33 +0000 UTC" firstStartedPulling="2025-09-30 06:34:37.290747689 +0000 UTC m=+920.765768729" lastFinishedPulling="2025-09-30 06:34:50.498677111 +0000 UTC m=+933.973698161" observedRunningTime="2025-09-30 06:34:51.274402053 +0000 UTC m=+934.749423143" watchObservedRunningTime="2025-09-30 06:34:51.280776647 +0000 UTC m=+934.755797717" Sep 30 06:34:52 crc kubenswrapper[4691]: I0930 06:34:52.260320 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"08782d24-2bd9-48d6-b9b2-12a2ad66e6d0","Type":"ContainerStarted","Data":"63c53e29777fb53a58364944d0fbfc40fb92d7a855db61ca787ab57cf39f834c"} Sep 30 06:34:52 crc kubenswrapper[4691]: I0930 06:34:52.264644 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-cell1-galera-0" event={"ID":"605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6","Type":"ContainerStarted","Data":"ad65ae3db8a3f139fba437af23b61fa654ca674e5e7430dd1b665f59aad01255"} Sep 30 06:34:52 crc kubenswrapper[4691]: I0930 06:34:52.292807 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=19.980287543 podStartE2EDuration="27.292778677s" podCreationTimestamp="2025-09-30 06:34:25 +0000 UTC" firstStartedPulling="2025-09-30 06:34:36.777843573 +0000 UTC m=+920.252864613" lastFinishedPulling="2025-09-30 06:34:44.090334667 +0000 UTC m=+927.565355747" observedRunningTime="2025-09-30 06:34:52.288458778 +0000 UTC m=+935.763479858" watchObservedRunningTime="2025-09-30 06:34:52.292778677 +0000 UTC m=+935.767799767" Sep 30 06:34:52 crc kubenswrapper[4691]: I0930 06:34:52.328459 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=18.992745952 podStartE2EDuration="26.328433897s" podCreationTimestamp="2025-09-30 06:34:26 +0000 UTC" firstStartedPulling="2025-09-30 06:34:36.779855078 +0000 UTC m=+920.254876118" lastFinishedPulling="2025-09-30 06:34:44.115543013 +0000 UTC m=+927.590564063" observedRunningTime="2025-09-30 06:34:52.319325785 +0000 UTC m=+935.794346865" watchObservedRunningTime="2025-09-30 06:34:52.328433897 +0000 UTC m=+935.803454957" Sep 30 06:34:52 crc kubenswrapper[4691]: I0930 06:34:52.615728 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:52 crc kubenswrapper[4691]: I0930 06:34:52.671040 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:52 crc kubenswrapper[4691]: I0930 06:34:52.736189 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:52 crc kubenswrapper[4691]: I0930 06:34:52.779504 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.032125 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.089835 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99796b587-zzw5c"] Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.274209 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"46482328-297b-40b1-83e1-2270733d27d7","Type":"ContainerStarted","Data":"4858f3f9cab3af7958684fe6fa7e0f5741e39952ab7b8169453061426352d59f"} Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.274459 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.274469 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"46482328-297b-40b1-83e1-2270733d27d7","Type":"ContainerStarted","Data":"fae26ae9054f75d616517cb204b067c78c6cc21e34ae9edf3169caac48c88a72"} Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.275113 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-99796b587-zzw5c" podUID="0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" containerName="dnsmasq-dns" 
containerID="cri-o://947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4" gracePeriod=10 Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.301662 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=3.010509461 podStartE2EDuration="18.301646075s" podCreationTimestamp="2025-09-30 06:34:35 +0000 UTC" firstStartedPulling="2025-09-30 06:34:37.401074268 +0000 UTC m=+920.876095308" lastFinishedPulling="2025-09-30 06:34:52.692210842 +0000 UTC m=+936.167231922" observedRunningTime="2025-09-30 06:34:53.29616159 +0000 UTC m=+936.771182640" watchObservedRunningTime="2025-09-30 06:34:53.301646075 +0000 UTC m=+936.776667115" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.322502 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.608085 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78bf94944f-kk2mm"] Sep 30 06:34:53 crc kubenswrapper[4691]: E0930 06:34:53.608404 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18a8bb07-0424-46c0-8405-c49878049ffc" containerName="init" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.608421 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="18a8bb07-0424-46c0-8405-c49878049ffc" containerName="init" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.608570 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="18a8bb07-0424-46c0-8405-c49878049ffc" containerName="init" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.610593 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.612585 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.634784 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78bf94944f-kk2mm"] Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.672496 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8h9p5"] Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.673464 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.675902 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.686978 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8h9p5"] Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.716547 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-ovsdbserver-sb\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.716643 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-dns-svc\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.716727 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54l5h\" (UniqueName: \"kubernetes.io/projected/46186ef0-fe53-4d5a-b096-d641504a12da-kube-api-access-54l5h\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.716752 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-config\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.818019 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-ovsdbserver-sb\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.818082 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24478def-6fea-4596-b4e3-fd3abee81a62-config\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.818109 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-dns-svc\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.818160 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54l5h\" (UniqueName: \"kubernetes.io/projected/46186ef0-fe53-4d5a-b096-d641504a12da-kube-api-access-54l5h\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " 
pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.818181 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-config\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.818201 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crpms\" (UniqueName: \"kubernetes.io/projected/24478def-6fea-4596-b4e3-fd3abee81a62-kube-api-access-crpms\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.818233 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24478def-6fea-4596-b4e3-fd3abee81a62-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.818264 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/24478def-6fea-4596-b4e3-fd3abee81a62-ovs-rundir\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.818288 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24478def-6fea-4596-b4e3-fd3abee81a62-combined-ca-bundle\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.818325 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/24478def-6fea-4596-b4e3-fd3abee81a62-ovn-rundir\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.819083 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-dns-svc\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.819082 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-ovsdbserver-sb\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.819189 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-config\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " 
pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.838422 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54l5h\" (UniqueName: \"kubernetes.io/projected/46186ef0-fe53-4d5a-b096-d641504a12da-kube-api-access-54l5h\") pod \"dnsmasq-dns-78bf94944f-kk2mm\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.882479 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.918151 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78bf94944f-kk2mm"] Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.919405 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24478def-6fea-4596-b4e3-fd3abee81a62-config\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.919458 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.919483 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crpms\" (UniqueName: \"kubernetes.io/projected/24478def-6fea-4596-b4e3-fd3abee81a62-kube-api-access-crpms\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.919513 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24478def-6fea-4596-b4e3-fd3abee81a62-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.919541 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/24478def-6fea-4596-b4e3-fd3abee81a62-ovs-rundir\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.919560 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24478def-6fea-4596-b4e3-fd3abee81a62-combined-ca-bundle\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.919597 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/24478def-6fea-4596-b4e3-fd3abee81a62-ovn-rundir\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.919857 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/24478def-6fea-4596-b4e3-fd3abee81a62-ovn-rundir\") 
pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.920402 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/24478def-6fea-4596-b4e3-fd3abee81a62-ovs-rundir\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.924903 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/24478def-6fea-4596-b4e3-fd3abee81a62-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.931047 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24478def-6fea-4596-b4e3-fd3abee81a62-config\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.939728 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24478def-6fea-4596-b4e3-fd3abee81a62-combined-ca-bundle\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.949439 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f6d45dc65-mp6g5"] Sep 30 06:34:53 crc kubenswrapper[4691]: E0930 06:34:53.949801 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" containerName="dnsmasq-dns" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.949833 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" containerName="dnsmasq-dns" Sep 30 06:34:53 crc kubenswrapper[4691]: E0930 06:34:53.949846 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" containerName="init" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.949853 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" containerName="init" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.950006 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" containerName="dnsmasq-dns" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.950787 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.958520 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.961119 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f6d45dc65-mp6g5"] Sep 30 06:34:53 crc kubenswrapper[4691]: I0930 06:34:53.961295 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crpms\" (UniqueName: \"kubernetes.io/projected/24478def-6fea-4596-b4e3-fd3abee81a62-kube-api-access-crpms\") pod \"ovn-controller-metrics-8h9p5\" (UID: \"24478def-6fea-4596-b4e3-fd3abee81a62\") " pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.020206 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-config\") pod \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.020353 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-dns-svc\") pod \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.020382 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfl6n\" (UniqueName: \"kubernetes.io/projected/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-kube-api-access-hfl6n\") pod \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\" (UID: \"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9\") " Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.020587 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.020618 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-config\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.020640 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.020677 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49jv6\" (UniqueName: \"kubernetes.io/projected/c64a95f7-003e-4e31-a314-2e24c0f624b3-kube-api-access-49jv6\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.020754 4691 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-dns-svc\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.023442 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-kube-api-access-hfl6n" (OuterVolumeSpecName: "kube-api-access-hfl6n") pod "0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" (UID: "0280bb49-f0ad-4039-bf11-3ec2b9b0aac9"). InnerVolumeSpecName "kube-api-access-hfl6n". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.037919 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8h9p5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.062619 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" (UID: "0280bb49-f0ad-4039-bf11-3ec2b9b0aac9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.072414 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-config" (OuterVolumeSpecName: "config") pod "0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" (UID: "0280bb49-f0ad-4039-bf11-3ec2b9b0aac9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.122296 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.122331 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-config\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.122360 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.122418 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49jv6\" (UniqueName: \"kubernetes.io/projected/c64a95f7-003e-4e31-a314-2e24c0f624b3-kube-api-access-49jv6\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.122455 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-dns-svc\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.122543 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.122553 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.122561 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfl6n\" (UniqueName: \"kubernetes.io/projected/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9-kube-api-access-hfl6n\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.123275 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-dns-svc\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.124415 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-config\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.125200 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-sb\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.125243 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-nb\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.141633 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49jv6\" (UniqueName: \"kubernetes.io/projected/c64a95f7-003e-4e31-a314-2e24c0f624b3-kube-api-access-49jv6\") pod \"dnsmasq-dns-5f6d45dc65-mp6g5\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") " pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.283422 4691 generic.go:334] "Generic (PLEG): container finished" podID="0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" containerID="947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4" exitCode=0 Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.284426 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-99796b587-zzw5c" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.297353 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99796b587-zzw5c" event={"ID":"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9","Type":"ContainerDied","Data":"947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4"} Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.297417 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-99796b587-zzw5c" event={"ID":"0280bb49-f0ad-4039-bf11-3ec2b9b0aac9","Type":"ContainerDied","Data":"f9fcd50e982b059f8c7235aa14c5aa54b89887de80c368780c5e8672bcb89bf2"} Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.297437 4691 scope.go:117] "RemoveContainer" containerID="947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.315750 4691 scope.go:117] "RemoveContainer" containerID="2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.319474 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.339006 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-99796b587-zzw5c"] Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.349720 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-99796b587-zzw5c"] Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.355798 4691 scope.go:117] "RemoveContainer" containerID="947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4" Sep 30 06:34:54 crc kubenswrapper[4691]: E0930 06:34:54.356306 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4\": container with ID starting with 947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4 not found: ID does not exist" containerID="947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.356337 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4"} err="failed to get container status \"947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4\": rpc error: code = NotFound desc = could not find container \"947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4\": container with ID starting with 947001d6e734caaec87d675da0071e7404dc8d0f9728f83924a11f3ea27f1db4 not found: ID does not exist" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.356374 4691 scope.go:117] "RemoveContainer" containerID="2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935" Sep 30 06:34:54 crc kubenswrapper[4691]: E0930 06:34:54.356639 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935\": container with ID starting with 2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935 not found: ID does not exist" containerID="2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.356664 4691 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935"} err="failed to get container status \"2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935\": rpc error: code = NotFound desc = could not find container \"2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935\": container with ID starting with 2830774f42d6d1bda70767331e6c5c21e9bded13d1875d085eaa8772486a9935 not found: ID does not exist" Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.410669 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78bf94944f-kk2mm"] Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.529196 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8h9p5"] Sep 30 06:34:54 crc kubenswrapper[4691]: W0930 06:34:54.547853 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24478def_6fea_4596_b4e3_fd3abee81a62.slice/crio-e1e78f852f2dff757716266f128c4b01c10f3b91c2d3b94c8fad139c47730b0c WatchSource:0}: Error finding container e1e78f852f2dff757716266f128c4b01c10f3b91c2d3b94c8fad139c47730b0c: Status 404 returned error can't find the container with id e1e78f852f2dff757716266f128c4b01c10f3b91c2d3b94c8fad139c47730b0c Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.733026 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:54 crc kubenswrapper[4691]: W0930 06:34:54.809659 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc64a95f7_003e_4e31_a314_2e24c0f624b3.slice/crio-1aa1c51402f26577f0144d735d33178ab6eb303c3f1fda377de207f0ec47126b WatchSource:0}: Error finding container 1aa1c51402f26577f0144d735d33178ab6eb303c3f1fda377de207f0ec47126b: Status 404 returned error can't find the container with id 1aa1c51402f26577f0144d735d33178ab6eb303c3f1fda377de207f0ec47126b Sep 30 06:34:54 crc kubenswrapper[4691]: I0930 06:34:54.811363 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f6d45dc65-mp6g5"] Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.239236 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0280bb49-f0ad-4039-bf11-3ec2b9b0aac9" path="/var/lib/kubelet/pods/0280bb49-f0ad-4039-bf11-3ec2b9b0aac9/volumes" Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.296811 4691 generic.go:334] "Generic (PLEG): container finished" podID="c64a95f7-003e-4e31-a314-2e24c0f624b3" containerID="d929be105bebeef02d777eadecd61adefc0f9aca20230b25454cc3a1305153e3" exitCode=0 Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.296873 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" event={"ID":"c64a95f7-003e-4e31-a314-2e24c0f624b3","Type":"ContainerDied","Data":"d929be105bebeef02d777eadecd61adefc0f9aca20230b25454cc3a1305153e3"} Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.296963 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" event={"ID":"c64a95f7-003e-4e31-a314-2e24c0f624b3","Type":"ContainerStarted","Data":"1aa1c51402f26577f0144d735d33178ab6eb303c3f1fda377de207f0ec47126b"} Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.298901 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8h9p5" 
event={"ID":"24478def-6fea-4596-b4e3-fd3abee81a62","Type":"ContainerStarted","Data":"8065c1414aa4f9216df7951853aa710c72ee944caf9bc338fad3de4b13e1fa0d"} Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.298950 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8h9p5" event={"ID":"24478def-6fea-4596-b4e3-fd3abee81a62","Type":"ContainerStarted","Data":"e1e78f852f2dff757716266f128c4b01c10f3b91c2d3b94c8fad139c47730b0c"} Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.300671 4691 generic.go:334] "Generic (PLEG): container finished" podID="46186ef0-fe53-4d5a-b096-d641504a12da" containerID="7401334e342c045339ee11747e67b0f376f7bf1ca133e8207f9eca70df825a48" exitCode=0 Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.300772 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" event={"ID":"46186ef0-fe53-4d5a-b096-d641504a12da","Type":"ContainerDied","Data":"7401334e342c045339ee11747e67b0f376f7bf1ca133e8207f9eca70df825a48"} Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.300812 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" event={"ID":"46186ef0-fe53-4d5a-b096-d641504a12da","Type":"ContainerStarted","Data":"4aae629081eefb4c01238eb73f2e79b304ad4c3987a16de7322e33c8f4b93fdc"} Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.388253 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8h9p5" podStartSLOduration=2.388233426 podStartE2EDuration="2.388233426s" podCreationTimestamp="2025-09-30 06:34:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:34:55.378822005 +0000 UTC m=+938.853843075" watchObservedRunningTime="2025-09-30 06:34:55.388233426 +0000 UTC m=+938.863254506" Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.708597 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.760747 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-config\") pod \"46186ef0-fe53-4d5a-b096-d641504a12da\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.761039 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-ovsdbserver-sb\") pod \"46186ef0-fe53-4d5a-b096-d641504a12da\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.761102 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54l5h\" (UniqueName: \"kubernetes.io/projected/46186ef0-fe53-4d5a-b096-d641504a12da-kube-api-access-54l5h\") pod \"46186ef0-fe53-4d5a-b096-d641504a12da\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.761187 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-dns-svc\") pod \"46186ef0-fe53-4d5a-b096-d641504a12da\" (UID: \"46186ef0-fe53-4d5a-b096-d641504a12da\") " Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.771060 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46186ef0-fe53-4d5a-b096-d641504a12da-kube-api-access-54l5h" (OuterVolumeSpecName: "kube-api-access-54l5h") pod "46186ef0-fe53-4d5a-b096-d641504a12da" (UID: "46186ef0-fe53-4d5a-b096-d641504a12da"). InnerVolumeSpecName "kube-api-access-54l5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.780713 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-config" (OuterVolumeSpecName: "config") pod "46186ef0-fe53-4d5a-b096-d641504a12da" (UID: "46186ef0-fe53-4d5a-b096-d641504a12da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.796779 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "46186ef0-fe53-4d5a-b096-d641504a12da" (UID: "46186ef0-fe53-4d5a-b096-d641504a12da"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.815468 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "46186ef0-fe53-4d5a-b096-d641504a12da" (UID: "46186ef0-fe53-4d5a-b096-d641504a12da"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.862605 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.862637 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54l5h\" (UniqueName: \"kubernetes.io/projected/46186ef0-fe53-4d5a-b096-d641504a12da-kube-api-access-54l5h\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.862648 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:55 crc kubenswrapper[4691]: I0930 06:34:55.862657 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46186ef0-fe53-4d5a-b096-d641504a12da-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.314938 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" event={"ID":"c64a95f7-003e-4e31-a314-2e24c0f624b3","Type":"ContainerStarted","Data":"48238f48eb050d30c90944ccc11cf7b1027115c6d7832d8d9db91a521129348e"} Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.319215 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" event={"ID":"46186ef0-fe53-4d5a-b096-d641504a12da","Type":"ContainerDied","Data":"4aae629081eefb4c01238eb73f2e79b304ad4c3987a16de7322e33c8f4b93fdc"} Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.319265 4691 scope.go:117] "RemoveContainer" containerID="7401334e342c045339ee11747e67b0f376f7bf1ca133e8207f9eca70df825a48" Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.319449 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78bf94944f-kk2mm" Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.332598 4691 generic.go:334] "Generic (PLEG): container finished" podID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerID="abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793" exitCode=0 Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.333059 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d61470fc-16c1-40fb-bc8a-17517013b3be","Type":"ContainerDied","Data":"abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793"} Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.345466 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" podStartSLOduration=3.345446563 podStartE2EDuration="3.345446563s" podCreationTimestamp="2025-09-30 06:34:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:34:56.334747911 +0000 UTC m=+939.809768961" watchObservedRunningTime="2025-09-30 06:34:56.345446563 +0000 UTC m=+939.820467593" Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.428236 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78bf94944f-kk2mm"] Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.434114 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78bf94944f-kk2mm"] Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.733680 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.935912 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Sep 30 06:34:56 crc kubenswrapper[4691]: I0930 06:34:56.936050 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Sep 30 06:34:57 crc kubenswrapper[4691]: I0930 06:34:57.184689 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Sep 30 06:34:57 crc kubenswrapper[4691]: I0930 06:34:57.241948 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46186ef0-fe53-4d5a-b096-d641504a12da" path="/var/lib/kubelet/pods/46186ef0-fe53-4d5a-b096-d641504a12da/volumes" Sep 30 06:34:57 crc kubenswrapper[4691]: I0930 06:34:57.343801 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" Sep 30 06:34:57 crc kubenswrapper[4691]: I0930 06:34:57.428292 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:57 crc kubenswrapper[4691]: I0930 06:34:57.428339 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Sep 30 06:34:57 crc kubenswrapper[4691]: I0930 06:34:57.432474 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Sep 30 06:34:57 crc kubenswrapper[4691]: I0930 06:34:57.812610 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:57 crc kubenswrapper[4691]: I0930 06:34:57.856315 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Sep 30 06:34:57 crc kubenswrapper[4691]: E0930 
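The two "Observed pod startup duration" entries above are plain bookkeeping by the kubelet's pod_startup_latency_tracker: podStartE2EDuration is the watch-observed running time minus podCreationTimestamp (06:34:55.388233426 - 06:34:53 = 2.388233426s for ovn-controller-metrics-8h9p5; the pull timestamps stay at the Go zero value 0001-01-01 because no image pull was needed). A minimal Go sketch of that arithmetic, with the timestamps copied from the log entry above; this is illustrative, not kubelet's actual code:

    package main

    import (
    	"fmt"
    	"time"
    )

    const layout = "2006-01-02 15:04:05 -0700 MST"

    func mustParse(s string) time.Time {
    	t, err := time.Parse(layout, s) // Go also accepts fractional seconds here
    	if err != nil {
    		panic(err)
    	}
    	return t
    }

    func main() {
    	// Values copied from the pod_startup_latency_tracker entry above.
    	created := mustParse("2025-09-30 06:34:53 +0000 UTC")
    	running := mustParse("2025-09-30 06:34:55.388233426 +0000 UTC")
    	fmt.Println(running.Sub(created)) // 2.388233426s == podStartE2EDuration
    }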
Sep 30 06:34:57 crc kubenswrapper[4691]: E0930 06:34:57.954794 4691 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.46:43026->38.102.83.46:45179: write tcp 38.102.83.46:43026->38.102.83.46:45179: write: broken pipe
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.089609 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 06:34:58 crc kubenswrapper[4691]: E0930 06:34:58.089921 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46186ef0-fe53-4d5a-b096-d641504a12da" containerName="init"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.089938 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="46186ef0-fe53-4d5a-b096-d641504a12da" containerName="init"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.090098 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="46186ef0-fe53-4d5a-b096-d641504a12da" containerName="init"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.090849 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.093743 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.094782 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.095241 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.095985 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7crnp"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.108488 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.212876 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59b91c6-5922-4272-9c75-4e139031c87b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.212943 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e59b91c6-5922-4272-9c75-4e139031c87b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.213078 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e59b91c6-5922-4272-9c75-4e139031c87b-scripts\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.213105 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e59b91c6-5922-4272-9c75-4e139031c87b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.213155 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e59b91c6-5922-4272-9c75-4e139031c87b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.213205 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpgjf\" (UniqueName: \"kubernetes.io/projected/e59b91c6-5922-4272-9c75-4e139031c87b-kube-api-access-lpgjf\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.213236 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59b91c6-5922-4272-9c75-4e139031c87b-config\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.314712 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e59b91c6-5922-4272-9c75-4e139031c87b-scripts\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.314748 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e59b91c6-5922-4272-9c75-4e139031c87b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.314801 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e59b91c6-5922-4272-9c75-4e139031c87b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.314848 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpgjf\" (UniqueName: \"kubernetes.io/projected/e59b91c6-5922-4272-9c75-4e139031c87b-kube-api-access-lpgjf\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.314874 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59b91c6-5922-4272-9c75-4e139031c87b-config\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.314915 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59b91c6-5922-4272-9c75-4e139031c87b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.314938 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e59b91c6-5922-4272-9c75-4e139031c87b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.315312 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/e59b91c6-5922-4272-9c75-4e139031c87b-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.316166 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e59b91c6-5922-4272-9c75-4e139031c87b-scripts\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.316432 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e59b91c6-5922-4272-9c75-4e139031c87b-config\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.319966 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/e59b91c6-5922-4272-9c75-4e139031c87b-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.320505 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/e59b91c6-5922-4272-9c75-4e139031c87b-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.328595 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e59b91c6-5922-4272-9c75-4e139031c87b-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.335211 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpgjf\" (UniqueName: \"kubernetes.io/projected/e59b91c6-5922-4272-9c75-4e139031c87b-kube-api-access-lpgjf\") pod \"ovn-northd-0\" (UID: \"e59b91c6-5922-4272-9c75-4e139031c87b\") " pod="openstack/ovn-northd-0"
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.413799 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
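The ovn-northd-0 block above shows the kubelet's usual three-step volume flow: reconciler_common.go first logs "VerifyControllerAttachedVolume started" for every declared volume, then "operationExecutor.MountVolume started", and operation_generator.go finally reports "MountVolume.SetUp succeeded". A toy Go sketch of that desired-state loop, assuming every step succeeds on the next reconcile pass; the volume names are taken from the log, everything else is illustrative and not the kubelet's real reconciler:

    package main

    import "fmt"

    // Phases as they appear in the log lines above.
    type phase int

    const (
    	attached phase = iota // "VerifyControllerAttachedVolume started"
    	mounting              // "operationExecutor.MountVolume started"
    	mounted               // "MountVolume.SetUp succeeded"
    )

    func main() {
    	vols := []struct {
    		name string
    		p    phase
    	}{
    		{"scripts", attached}, {"config", attached}, {"ovn-rundir", attached},
    		{"combined-ca-bundle", attached}, {"metrics-certs-tls-certs", attached},
    		{"ovn-northd-tls-certs", attached}, {"kube-api-access-lpgjf", attached},
    	}
    	labels := []string{"attached", "mounting", "mounted"}
    	for pass := 1; ; pass++ { // one reconcile pass per iteration
    		changed := false
    		for i := range vols {
    			if vols[i].p < mounted {
    				vols[i].p++ // assume the next step succeeds
    				fmt.Printf("pass %d: %s -> %s\n", pass, vols[i].name, labels[vols[i].p])
    				changed = true
    			}
    		}
    		if !changed {
    			break // actual state now matches desired state
    		}
    	}
    }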
Sep 30 06:34:58 crc kubenswrapper[4691]: I0930 06:34:58.891798 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Sep 30 06:34:58 crc kubenswrapper[4691]: W0930 06:34:58.902593 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode59b91c6_5922_4272_9c75_4e139031c87b.slice/crio-846bc0b01d74b549b57f1d2443cfe7a4aa205e73a4519e064b7716285ac9bf73 WatchSource:0}: Error finding container 846bc0b01d74b549b57f1d2443cfe7a4aa205e73a4519e064b7716285ac9bf73: Status 404 returned error can't find the container with id 846bc0b01d74b549b57f1d2443cfe7a4aa205e73a4519e064b7716285ac9bf73
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.370347 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.372813 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e59b91c6-5922-4272-9c75-4e139031c87b","Type":"ContainerStarted","Data":"846bc0b01d74b549b57f1d2443cfe7a4aa205e73a4519e064b7716285ac9bf73"}
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.427456 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-create-mx4sz"]
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.428538 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mx4sz"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.442023 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f6d45dc65-mp6g5"]
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.442488 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" podUID="c64a95f7-003e-4e31-a314-2e24c0f624b3" containerName="dnsmasq-dns" containerID="cri-o://48238f48eb050d30c90944ccc11cf7b1027115c6d7832d8d9db91a521129348e" gracePeriod=10
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.448230 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-mx4sz"]
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.479103 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-648b6fc9cc-b6vxg"]
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.480321 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.495921 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-648b6fc9cc-b6vxg"]
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.534862 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-sb\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.534923 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vswd4\" (UniqueName: \"kubernetes.io/projected/26e581b3-ef82-4712-827a-48a328785696-kube-api-access-vswd4\") pod \"watcher-db-create-mx4sz\" (UID: \"26e581b3-ef82-4712-827a-48a328785696\") " pod="openstack/watcher-db-create-mx4sz"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.534990 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-config\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.535013 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-778pw\" (UniqueName: \"kubernetes.io/projected/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-kube-api-access-778pw\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.535055 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-nb\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.535076 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-dns-svc\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.636546 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-config\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.636596 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-778pw\" (UniqueName: \"kubernetes.io/projected/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-kube-api-access-778pw\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.636647 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-nb\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.636668 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-dns-svc\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.636706 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-sb\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.636730 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vswd4\" (UniqueName: \"kubernetes.io/projected/26e581b3-ef82-4712-827a-48a328785696-kube-api-access-vswd4\") pod \"watcher-db-create-mx4sz\" (UID: \"26e581b3-ef82-4712-827a-48a328785696\") " pod="openstack/watcher-db-create-mx4sz"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.637517 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-config\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.637531 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-nb\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.637764 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-dns-svc\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.637770 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-sb\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.654455 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-778pw\" (UniqueName: \"kubernetes.io/projected/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-kube-api-access-778pw\") pod \"dnsmasq-dns-648b6fc9cc-b6vxg\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.654790 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vswd4\" (UniqueName: \"kubernetes.io/projected/26e581b3-ef82-4712-827a-48a328785696-kube-api-access-vswd4\") pod \"watcher-db-create-mx4sz\" (UID: \"26e581b3-ef82-4712-827a-48a328785696\") " pod="openstack/watcher-db-create-mx4sz"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.755570 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mx4sz"
Sep 30 06:34:59 crc kubenswrapper[4691]: I0930 06:34:59.813377 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.250278 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-create-mx4sz"]
Sep 30 06:35:00 crc kubenswrapper[4691]: W0930 06:35:00.259224 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod26e581b3_ef82_4712_827a_48a328785696.slice/crio-014d796699a20c859218a531f23319c3f3652c14764f52dad1dfffb0e1157a43 WatchSource:0}: Error finding container 014d796699a20c859218a531f23319c3f3652c14764f52dad1dfffb0e1157a43: Status 404 returned error can't find the container with id 014d796699a20c859218a531f23319c3f3652c14764f52dad1dfffb0e1157a43
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.343932 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-648b6fc9cc-b6vxg"]
Sep 30 06:35:00 crc kubenswrapper[4691]: W0930 06:35:00.354822 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc9dc6ca_48fa_4947_a151_db88fc6bcd0c.slice/crio-534019919df35d562c6066de8287c10ab907f04ed20c23bae1c21bca0ae17561 WatchSource:0}: Error finding container 534019919df35d562c6066de8287c10ab907f04ed20c23bae1c21bca0ae17561: Status 404 returned error can't find the container with id 534019919df35d562c6066de8287c10ab907f04ed20c23bae1c21bca0ae17561
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.395314 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mx4sz" event={"ID":"26e581b3-ef82-4712-827a-48a328785696","Type":"ContainerStarted","Data":"014d796699a20c859218a531f23319c3f3652c14764f52dad1dfffb0e1157a43"}
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.397194 4691 generic.go:334] "Generic (PLEG): container finished" podID="c64a95f7-003e-4e31-a314-2e24c0f624b3" containerID="48238f48eb050d30c90944ccc11cf7b1027115c6d7832d8d9db91a521129348e" exitCode=0
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.397244 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" event={"ID":"c64a95f7-003e-4e31-a314-2e24c0f624b3","Type":"ContainerDied","Data":"48238f48eb050d30c90944ccc11cf7b1027115c6d7832d8d9db91a521129348e"}
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.398081 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg" event={"ID":"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c","Type":"ContainerStarted","Data":"534019919df35d562c6066de8287c10ab907f04ed20c23bae1c21bca0ae17561"}
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.456655 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.524723 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Sep 30 06:35:00 crc kubenswrapper[4691]: E0930 06:35:00.525046 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64a95f7-003e-4e31-a314-2e24c0f624b3" containerName="init"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.525057 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64a95f7-003e-4e31-a314-2e24c0f624b3" containerName="init"
Sep 30 06:35:00 crc kubenswrapper[4691]: E0930 06:35:00.525076 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c64a95f7-003e-4e31-a314-2e24c0f624b3" containerName="dnsmasq-dns"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.525082 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c64a95f7-003e-4e31-a314-2e24c0f624b3" containerName="dnsmasq-dns"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.525245 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c64a95f7-003e-4e31-a314-2e24c0f624b3" containerName="dnsmasq-dns"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.529695 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.531597 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.531740 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-d78vf"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.531747 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.531848 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.544272 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.551447 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-sb\") pod \"c64a95f7-003e-4e31-a314-2e24c0f624b3\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") "
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.551486 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-nb\") pod \"c64a95f7-003e-4e31-a314-2e24c0f624b3\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") "
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.551522 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-dns-svc\") pod \"c64a95f7-003e-4e31-a314-2e24c0f624b3\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") "
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.551603 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49jv6\" (UniqueName: \"kubernetes.io/projected/c64a95f7-003e-4e31-a314-2e24c0f624b3-kube-api-access-49jv6\") pod \"c64a95f7-003e-4e31-a314-2e24c0f624b3\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") "
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.551631 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-config\") pod \"c64a95f7-003e-4e31-a314-2e24c0f624b3\" (UID: \"c64a95f7-003e-4e31-a314-2e24c0f624b3\") "
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.574112 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c64a95f7-003e-4e31-a314-2e24c0f624b3-kube-api-access-49jv6" (OuterVolumeSpecName: "kube-api-access-49jv6") pod "c64a95f7-003e-4e31-a314-2e24c0f624b3" (UID: "c64a95f7-003e-4e31-a314-2e24c0f624b3"). InnerVolumeSpecName "kube-api-access-49jv6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.608611 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-config" (OuterVolumeSpecName: "config") pod "c64a95f7-003e-4e31-a314-2e24c0f624b3" (UID: "c64a95f7-003e-4e31-a314-2e24c0f624b3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.609165 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c64a95f7-003e-4e31-a314-2e24c0f624b3" (UID: "c64a95f7-003e-4e31-a314-2e24c0f624b3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.613383 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c64a95f7-003e-4e31-a314-2e24c0f624b3" (UID: "c64a95f7-003e-4e31-a314-2e24c0f624b3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.632071 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c64a95f7-003e-4e31-a314-2e24c0f624b3" (UID: "c64a95f7-003e-4e31-a314-2e24c0f624b3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.652894 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccb4d975-40e7-4a38-8b86-b18e685c570b-lock\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.652974 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccb4d975-40e7-4a38-8b86-b18e685c570b-cache\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.653157 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.653347 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv7p6\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-kube-api-access-gv7p6\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.653553 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.653638 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49jv6\" (UniqueName: \"kubernetes.io/projected/c64a95f7-003e-4e31-a314-2e24c0f624b3-kube-api-access-49jv6\") on node \"crc\" DevicePath \"\""
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.653662 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-config\") on node \"crc\" DevicePath \"\""
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.653676 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.653688 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.653702 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c64a95f7-003e-4e31-a314-2e24c0f624b3-dns-svc\") on node \"crc\" DevicePath \"\""
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.754956 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.755040 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv7p6\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-kube-api-access-gv7p6\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.755093 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.755124 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccb4d975-40e7-4a38-8b86-b18e685c570b-lock\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.755150 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccb4d975-40e7-4a38-8b86-b18e685c570b-cache\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: E0930 06:35:00.755120 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Sep 30 06:35:00 crc kubenswrapper[4691]: E0930 06:35:00.755229 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Sep 30 06:35:00 crc kubenswrapper[4691]: E0930 06:35:00.755333 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift podName:ccb4d975-40e7-4a38-8b86-b18e685c570b nodeName:}" failed. No retries permitted until 2025-09-30 06:35:01.255309355 +0000 UTC m=+944.730330405 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift") pod "swift-storage-0" (UID: "ccb4d975-40e7-4a38-8b86-b18e685c570b") : configmap "swift-ring-files" not found
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.755418 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.755956 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ccb4d975-40e7-4a38-8b86-b18e685c570b-cache\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.756333 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/ccb4d975-40e7-4a38-8b86-b18e685c570b-lock\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.777869 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv7p6\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-kube-api-access-gv7p6\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:00 crc kubenswrapper[4691]: I0930 06:35:00.802087 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.263905 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:01 crc kubenswrapper[4691]: E0930 06:35:01.264443 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Sep 30 06:35:01 crc kubenswrapper[4691]: E0930 06:35:01.264515 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Sep 30 06:35:01 crc kubenswrapper[4691]: E0930 06:35:01.264609 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift podName:ccb4d975-40e7-4a38-8b86-b18e685c570b nodeName:}" failed. No retries permitted until 2025-09-30 06:35:02.264593264 +0000 UTC m=+945.739614304 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift") pod "swift-storage-0" (UID: "ccb4d975-40e7-4a38-8b86-b18e685c570b") : configmap "swift-ring-files" not found
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.406760 4691 generic.go:334] "Generic (PLEG): container finished" podID="bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" containerID="6cd42cfdf019af1631abfd5a42ec4c12c920decb6fcb33ef972285226821cd47" exitCode=0
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.406853 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg" event={"ID":"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c","Type":"ContainerDied","Data":"6cd42cfdf019af1631abfd5a42ec4c12c920decb6fcb33ef972285226821cd47"}
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.409399 4691 generic.go:334] "Generic (PLEG): container finished" podID="26e581b3-ef82-4712-827a-48a328785696" containerID="b88dbada347fb995043d43e78fbf0983e8e441303f74888d3c86bd5c9ff7933b" exitCode=0
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.409487 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mx4sz" event={"ID":"26e581b3-ef82-4712-827a-48a328785696","Type":"ContainerDied","Data":"b88dbada347fb995043d43e78fbf0983e8e441303f74888d3c86bd5c9ff7933b"}
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.411986 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5" event={"ID":"c64a95f7-003e-4e31-a314-2e24c0f624b3","Type":"ContainerDied","Data":"1aa1c51402f26577f0144d735d33178ab6eb303c3f1fda377de207f0ec47126b"}
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.412010 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f6d45dc65-mp6g5"
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.412027 4691 scope.go:117] "RemoveContainer" containerID="48238f48eb050d30c90944ccc11cf7b1027115c6d7832d8d9db91a521129348e"
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.447183 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f6d45dc65-mp6g5"]
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.452032 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f6d45dc65-mp6g5"]
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.558132 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.612985 4691 scope.go:117] "RemoveContainer" containerID="d929be105bebeef02d777eadecd61adefc0f9aca20230b25454cc3a1305153e3"
Sep 30 06:35:01 crc kubenswrapper[4691]: I0930 06:35:01.628273 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.284912 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:02 crc kubenswrapper[4691]: E0930 06:35:02.285127 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Sep 30 06:35:02 crc kubenswrapper[4691]: E0930 06:35:02.285352 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Sep 30 06:35:02 crc kubenswrapper[4691]: E0930 06:35:02.285415 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift podName:ccb4d975-40e7-4a38-8b86-b18e685c570b nodeName:}" failed. No retries permitted until 2025-09-30 06:35:04.285397335 +0000 UTC m=+947.760418375 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift") pod "swift-storage-0" (UID: "ccb4d975-40e7-4a38-8b86-b18e685c570b") : configmap "swift-ring-files" not found
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.432915 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg" event={"ID":"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c","Type":"ContainerStarted","Data":"bf77a04625f42b543380f37ddd1340ba394ea4c321e945ee76a74c7774bc1181"}
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.433015 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg"
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.444494 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e59b91c6-5922-4272-9c75-4e139031c87b","Type":"ContainerStarted","Data":"31b7a3ba6aff545d6f96291c176980b796d637a9837fd1d595c12f2ef062693b"}
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.444605 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"e59b91c6-5922-4272-9c75-4e139031c87b","Type":"ContainerStarted","Data":"9a2bc9a15f60b47353fe394cd3908559776c66e60c6c4dc730efca3f1fc8cb4e"}
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.459954 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg" podStartSLOduration=3.459937898 podStartE2EDuration="3.459937898s" podCreationTimestamp="2025-09-30 06:34:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:35:02.453819932 +0000 UTC m=+945.928840982" watchObservedRunningTime="2025-09-30 06:35:02.459937898 +0000 UTC m=+945.934958938"
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.478398 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=1.729456101 podStartE2EDuration="4.478353916s" podCreationTimestamp="2025-09-30 06:34:58 +0000 UTC" firstStartedPulling="2025-09-30 06:34:58.905728844 +0000 UTC m=+942.380749894" lastFinishedPulling="2025-09-30 06:35:01.654626659 +0000 UTC m=+945.129647709" observedRunningTime="2025-09-30 06:35:02.476108875 +0000 UTC m=+945.951129925" watchObservedRunningTime="2025-09-30 06:35:02.478353916 +0000 UTC m=+945.953374966"
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.872542 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mx4sz"
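The etc-swift failures for swift-storage-0 repeat because the projected volume references a ConfigMap, swift-ring-files, that does not exist yet, and nestedpendingoperations doubles durationBeforeRetry on each failed attempt: 500ms, 1s, 2s above, then 4s below. A stdlib Go sketch of that doubling schedule; the two-minute ceiling is an assumption for illustration, since this log only shows the first four steps:

    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	// Doubling schedule as logged by nestedpendingoperations.go:348.
    	const maxDelay = 2 * time.Minute // assumed cap, not taken from this log
    	for d := 500 * time.Millisecond; d <= maxDelay; d *= 2 {
    		fmt.Printf("durationBeforeRetry %v\n", d) // 500ms, 1s, 2s, 4s, ...
    	}
    }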
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.965312 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-hskhk"]
Sep 30 06:35:02 crc kubenswrapper[4691]: E0930 06:35:02.965773 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26e581b3-ef82-4712-827a-48a328785696" containerName="mariadb-database-create"
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.965792 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="26e581b3-ef82-4712-827a-48a328785696" containerName="mariadb-database-create"
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.965983 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="26e581b3-ef82-4712-827a-48a328785696" containerName="mariadb-database-create"
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.966751 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hskhk"
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.971311 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hskhk"]
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.995177 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vswd4\" (UniqueName: \"kubernetes.io/projected/26e581b3-ef82-4712-827a-48a328785696-kube-api-access-vswd4\") pod \"26e581b3-ef82-4712-827a-48a328785696\" (UID: \"26e581b3-ef82-4712-827a-48a328785696\") "
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.995836 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khrvh\" (UniqueName: \"kubernetes.io/projected/988ba2e2-8687-4a2d-91b4-f158c4725b65-kube-api-access-khrvh\") pod \"glance-db-create-hskhk\" (UID: \"988ba2e2-8687-4a2d-91b4-f158c4725b65\") " pod="openstack/glance-db-create-hskhk"
Sep 30 06:35:02 crc kubenswrapper[4691]: I0930 06:35:02.999598 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26e581b3-ef82-4712-827a-48a328785696-kube-api-access-vswd4" (OuterVolumeSpecName: "kube-api-access-vswd4") pod "26e581b3-ef82-4712-827a-48a328785696" (UID: "26e581b3-ef82-4712-827a-48a328785696"). InnerVolumeSpecName "kube-api-access-vswd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:35:03 crc kubenswrapper[4691]: I0930 06:35:03.097261 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khrvh\" (UniqueName: \"kubernetes.io/projected/988ba2e2-8687-4a2d-91b4-f158c4725b65-kube-api-access-khrvh\") pod \"glance-db-create-hskhk\" (UID: \"988ba2e2-8687-4a2d-91b4-f158c4725b65\") " pod="openstack/glance-db-create-hskhk"
Sep 30 06:35:03 crc kubenswrapper[4691]: I0930 06:35:03.097357 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vswd4\" (UniqueName: \"kubernetes.io/projected/26e581b3-ef82-4712-827a-48a328785696-kube-api-access-vswd4\") on node \"crc\" DevicePath \"\""
Sep 30 06:35:03 crc kubenswrapper[4691]: I0930 06:35:03.112373 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khrvh\" (UniqueName: \"kubernetes.io/projected/988ba2e2-8687-4a2d-91b4-f158c4725b65-kube-api-access-khrvh\") pod \"glance-db-create-hskhk\" (UID: \"988ba2e2-8687-4a2d-91b4-f158c4725b65\") " pod="openstack/glance-db-create-hskhk"
Sep 30 06:35:03 crc kubenswrapper[4691]: I0930 06:35:03.236451 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c64a95f7-003e-4e31-a314-2e24c0f624b3" path="/var/lib/kubelet/pods/c64a95f7-003e-4e31-a314-2e24c0f624b3/volumes"
Sep 30 06:35:03 crc kubenswrapper[4691]: I0930 06:35:03.292333 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hskhk"
Sep 30 06:35:03 crc kubenswrapper[4691]: I0930 06:35:03.414728 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Sep 30 06:35:03 crc kubenswrapper[4691]: I0930 06:35:03.455061 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-create-mx4sz" event={"ID":"26e581b3-ef82-4712-827a-48a328785696","Type":"ContainerDied","Data":"014d796699a20c859218a531f23319c3f3652c14764f52dad1dfffb0e1157a43"}
Sep 30 06:35:03 crc kubenswrapper[4691]: I0930 06:35:03.455118 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="014d796699a20c859218a531f23319c3f3652c14764f52dad1dfffb0e1157a43"
Sep 30 06:35:03 crc kubenswrapper[4691]: I0930 06:35:03.455313 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-create-mx4sz"
Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.317625 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0"
Sep 30 06:35:04 crc kubenswrapper[4691]: E0930 06:35:04.317819 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Sep 30 06:35:04 crc kubenswrapper[4691]: E0930 06:35:04.317833 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Sep 30 06:35:04 crc kubenswrapper[4691]: E0930 06:35:04.317900 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift podName:ccb4d975-40e7-4a38-8b86-b18e685c570b nodeName:}" failed. No retries permitted until 2025-09-30 06:35:08.317867024 +0000 UTC m=+951.792888064 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift") pod "swift-storage-0" (UID: "ccb4d975-40e7-4a38-8b86-b18e685c570b") : configmap "swift-ring-files" not found
Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.631576 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-s8254"]
Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.634782 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s8254"
Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.639054 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.639313 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.640529 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.643649 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-s8254"]
Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.663171 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s8254"]
Sep 30 06:35:04 crc kubenswrapper[4691]: E0930 06:35:04.663633 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-rstpl ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-rstpl ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-s8254" podUID="88a6624c-c601-461e-bc30-2a629d9addf0"
Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.680868 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-f4xvm"]
Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.692120 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f4xvm"
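The swift-ring-rebalance job appearing here is what eventually publishes swift-ring-files, so the etc-swift mount cannot succeed before the job runs (note the s8254 incarnation is deleted mid-sync, hence "context canceled", and is immediately replaced by f4xvm). If you were debugging this by hand, a client-go lookup like the sketch below would confirm whether the ConfigMap exists yet; the kubeconfig path is a placeholder, everything else is the standard client-go API:

    package main

    import (
    	"context"
    	"fmt"

    	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
    	"k8s.io/client-go/kubernetes"
    	"k8s.io/client-go/tools/clientcmd"
    )

    func main() {
    	// Placeholder path; point this at a kubeconfig for the crc cluster.
    	cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig")
    	if err != nil {
    		panic(err)
    	}
    	client := kubernetes.NewForConfigOrDie(cfg)
    	_, err = client.CoreV1().ConfigMaps("openstack").Get(
    		context.TODO(), "swift-ring-files", metav1.GetOptions{})
    	if err != nil {
    		// Mirrors the projected.go errors above until the rebalance
    		// job publishes the ConfigMap.
    		fmt.Println("swift-ring-files not found yet:", err)
    		return
    	}
    	fmt.Println("swift-ring-files exists; the etc-swift mount should succeed")
    }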
Need to start a new one" pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.714428 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-f4xvm"] Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.728444 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-swiftconf\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.728706 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-ring-data-devices\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.728739 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-swiftconf\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.728759 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-dispersionconf\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.728781 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-dispersionconf\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.728819 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6btr\" (UniqueName: \"kubernetes.io/projected/22775d02-1312-4d7a-917d-80dc62539dba-kube-api-access-k6btr\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.729008 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-combined-ca-bundle\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.729048 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/88a6624c-c601-461e-bc30-2a629d9addf0-etc-swift\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.729071 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-combined-ca-bundle\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.729159 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-scripts\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.729185 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-scripts\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.729274 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rstpl\" (UniqueName: \"kubernetes.io/projected/88a6624c-c601-461e-bc30-2a629d9addf0-kube-api-access-rstpl\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.729338 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22775d02-1312-4d7a-917d-80dc62539dba-etc-swift\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.729377 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-ring-data-devices\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.832487 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-combined-ca-bundle\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.832596 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/88a6624c-c601-461e-bc30-2a629d9addf0-etc-swift\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.832696 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-combined-ca-bundle\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.832749 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-scripts\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.832777 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-scripts\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.832839 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rstpl\" (UniqueName: \"kubernetes.io/projected/88a6624c-c601-461e-bc30-2a629d9addf0-kube-api-access-rstpl\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.832915 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22775d02-1312-4d7a-917d-80dc62539dba-etc-swift\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.832951 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-ring-data-devices\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.833001 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-swiftconf\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.833032 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-ring-data-devices\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.833074 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-swiftconf\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.833107 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-dispersionconf\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.833112 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/88a6624c-c601-461e-bc30-2a629d9addf0-etc-swift\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.833137 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-dispersionconf\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.833190 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6btr\" (UniqueName: \"kubernetes.io/projected/22775d02-1312-4d7a-917d-80dc62539dba-kube-api-access-k6btr\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.833681 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22775d02-1312-4d7a-917d-80dc62539dba-etc-swift\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.834141 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-scripts\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.834475 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-ring-data-devices\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.836322 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-scripts\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.837255 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-ring-data-devices\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.838364 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-swiftconf\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.840343 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-dispersionconf\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " 
pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.840977 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-swiftconf\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.842042 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-dispersionconf\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.842056 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-combined-ca-bundle\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.845612 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-combined-ca-bundle\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.848497 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rstpl\" (UniqueName: \"kubernetes.io/projected/88a6624c-c601-461e-bc30-2a629d9addf0-kube-api-access-rstpl\") pod \"swift-ring-rebalance-s8254\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:04 crc kubenswrapper[4691]: I0930 06:35:04.848697 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6btr\" (UniqueName: \"kubernetes.io/projected/22775d02-1312-4d7a-917d-80dc62539dba-kube-api-access-k6btr\") pod \"swift-ring-rebalance-f4xvm\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.028987 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.471934 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.492301 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.543949 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/88a6624c-c601-461e-bc30-2a629d9addf0-etc-swift\") pod \"88a6624c-c601-461e-bc30-2a629d9addf0\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.544077 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-swiftconf\") pod \"88a6624c-c601-461e-bc30-2a629d9addf0\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.544099 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rstpl\" (UniqueName: \"kubernetes.io/projected/88a6624c-c601-461e-bc30-2a629d9addf0-kube-api-access-rstpl\") pod \"88a6624c-c601-461e-bc30-2a629d9addf0\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.544144 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-dispersionconf\") pod \"88a6624c-c601-461e-bc30-2a629d9addf0\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.544247 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-combined-ca-bundle\") pod \"88a6624c-c601-461e-bc30-2a629d9addf0\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.544269 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-scripts\") pod \"88a6624c-c601-461e-bc30-2a629d9addf0\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.544279 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/88a6624c-c601-461e-bc30-2a629d9addf0-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "88a6624c-c601-461e-bc30-2a629d9addf0" (UID: "88a6624c-c601-461e-bc30-2a629d9addf0"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.544419 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-ring-data-devices\") pod \"88a6624c-c601-461e-bc30-2a629d9addf0\" (UID: \"88a6624c-c601-461e-bc30-2a629d9addf0\") " Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.544944 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-scripts" (OuterVolumeSpecName: "scripts") pod "88a6624c-c601-461e-bc30-2a629d9addf0" (UID: "88a6624c-c601-461e-bc30-2a629d9addf0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.545046 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "88a6624c-c601-461e-bc30-2a629d9addf0" (UID: "88a6624c-c601-461e-bc30-2a629d9addf0"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.545358 4691 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.545414 4691 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/88a6624c-c601-461e-bc30-2a629d9addf0-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.545427 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/88a6624c-c601-461e-bc30-2a629d9addf0-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.548526 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "88a6624c-c601-461e-bc30-2a629d9addf0" (UID: "88a6624c-c601-461e-bc30-2a629d9addf0"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.548816 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "88a6624c-c601-461e-bc30-2a629d9addf0" (UID: "88a6624c-c601-461e-bc30-2a629d9addf0"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.549071 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "88a6624c-c601-461e-bc30-2a629d9addf0" (UID: "88a6624c-c601-461e-bc30-2a629d9addf0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.549260 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88a6624c-c601-461e-bc30-2a629d9addf0-kube-api-access-rstpl" (OuterVolumeSpecName: "kube-api-access-rstpl") pod "88a6624c-c601-461e-bc30-2a629d9addf0" (UID: "88a6624c-c601-461e-bc30-2a629d9addf0"). InnerVolumeSpecName "kube-api-access-rstpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.646739 4691 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.647083 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rstpl\" (UniqueName: \"kubernetes.io/projected/88a6624c-c601-461e-bc30-2a629d9addf0-kube-api-access-rstpl\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.647095 4691 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.647104 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/88a6624c-c601-461e-bc30-2a629d9addf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.726181 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-hskhk"] Sep 30 06:35:05 crc kubenswrapper[4691]: W0930 06:35:05.727058 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod988ba2e2_8687_4a2d_91b4_f158c4725b65.slice/crio-9b54fb49afca53e73f6f6dd275132eccf76f000767c83cfcd136915507c4b83e WatchSource:0}: Error finding container 9b54fb49afca53e73f6f6dd275132eccf76f000767c83cfcd136915507c4b83e: Status 404 returned error can't find the container with id 9b54fb49afca53e73f6f6dd275132eccf76f000767c83cfcd136915507c4b83e Sep 30 06:35:05 crc kubenswrapper[4691]: I0930 06:35:05.843170 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-f4xvm"] Sep 30 06:35:05 crc kubenswrapper[4691]: W0930 06:35:05.849479 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22775d02_1312_4d7a_917d_80dc62539dba.slice/crio-30403dfceebf9000828ca6a40a1a0617ed1e84c6cf51a938f0998d455f23f316 WatchSource:0}: Error finding container 30403dfceebf9000828ca6a40a1a0617ed1e84c6cf51a938f0998d455f23f316: Status 404 returned error can't find the container with id 30403dfceebf9000828ca6a40a1a0617ed1e84c6cf51a938f0998d455f23f316 Sep 30 06:35:06 crc kubenswrapper[4691]: I0930 06:35:06.479224 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f4xvm" event={"ID":"22775d02-1312-4d7a-917d-80dc62539dba","Type":"ContainerStarted","Data":"30403dfceebf9000828ca6a40a1a0617ed1e84c6cf51a938f0998d455f23f316"} Sep 30 06:35:06 crc kubenswrapper[4691]: I0930 06:35:06.480921 4691 generic.go:334] "Generic (PLEG): container finished" podID="988ba2e2-8687-4a2d-91b4-f158c4725b65" containerID="e62a84f0d0da996c09af481de07b33b239fa29f5417410b7a62fbaf736a983cc" exitCode=0 Sep 30 06:35:06 crc kubenswrapper[4691]: I0930 06:35:06.480960 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hskhk" event={"ID":"988ba2e2-8687-4a2d-91b4-f158c4725b65","Type":"ContainerDied","Data":"e62a84f0d0da996c09af481de07b33b239fa29f5417410b7a62fbaf736a983cc"} Sep 30 06:35:06 crc kubenswrapper[4691]: I0930 06:35:06.480974 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-hskhk" event={"ID":"988ba2e2-8687-4a2d-91b4-f158c4725b65","Type":"ContainerStarted","Data":"9b54fb49afca53e73f6f6dd275132eccf76f000767c83cfcd136915507c4b83e"} Sep 30 06:35:06 crc kubenswrapper[4691]: I0930 06:35:06.482849 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d61470fc-16c1-40fb-bc8a-17517013b3be","Type":"ContainerStarted","Data":"d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f"} Sep 30 06:35:06 crc kubenswrapper[4691]: I0930 06:35:06.482875 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-s8254" Sep 30 06:35:06 crc kubenswrapper[4691]: I0930 06:35:06.534985 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-s8254"] Sep 30 06:35:06 crc kubenswrapper[4691]: I0930 06:35:06.542497 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-s8254"] Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.253606 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88a6624c-c601-461e-bc30-2a629d9addf0" path="/var/lib/kubelet/pods/88a6624c-c601-461e-bc30-2a629d9addf0/volumes" Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.362621 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-cfvmh"] Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.363727 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cfvmh" Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.386652 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cfvmh"] Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.479118 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdj7c\" (UniqueName: \"kubernetes.io/projected/f822b186-ff4b-4190-b86f-e20bcc5ae236-kube-api-access-wdj7c\") pod \"keystone-db-create-cfvmh\" (UID: \"f822b186-ff4b-4190-b86f-e20bcc5ae236\") " pod="openstack/keystone-db-create-cfvmh" Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.580678 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdj7c\" (UniqueName: \"kubernetes.io/projected/f822b186-ff4b-4190-b86f-e20bcc5ae236-kube-api-access-wdj7c\") pod \"keystone-db-create-cfvmh\" (UID: \"f822b186-ff4b-4190-b86f-e20bcc5ae236\") " pod="openstack/keystone-db-create-cfvmh" Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.610235 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdj7c\" (UniqueName: \"kubernetes.io/projected/f822b186-ff4b-4190-b86f-e20bcc5ae236-kube-api-access-wdj7c\") pod \"keystone-db-create-cfvmh\" (UID: \"f822b186-ff4b-4190-b86f-e20bcc5ae236\") " pod="openstack/keystone-db-create-cfvmh" Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.675421 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-jzcb7"] Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.676760 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jzcb7" Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.691519 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jzcb7"] Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.695861 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cfvmh" Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.782857 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwstd\" (UniqueName: \"kubernetes.io/projected/c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e-kube-api-access-dwstd\") pod \"placement-db-create-jzcb7\" (UID: \"c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e\") " pod="openstack/placement-db-create-jzcb7" Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.884730 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwstd\" (UniqueName: \"kubernetes.io/projected/c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e-kube-api-access-dwstd\") pod \"placement-db-create-jzcb7\" (UID: \"c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e\") " pod="openstack/placement-db-create-jzcb7" Sep 30 06:35:07 crc kubenswrapper[4691]: I0930 06:35:07.903558 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwstd\" (UniqueName: \"kubernetes.io/projected/c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e-kube-api-access-dwstd\") pod \"placement-db-create-jzcb7\" (UID: \"c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e\") " pod="openstack/placement-db-create-jzcb7" Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.004510 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jzcb7" Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.395710 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0" Sep 30 06:35:08 crc kubenswrapper[4691]: E0930 06:35:08.396081 4691 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Sep 30 06:35:08 crc kubenswrapper[4691]: E0930 06:35:08.396285 4691 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Sep 30 06:35:08 crc kubenswrapper[4691]: E0930 06:35:08.396389 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift podName:ccb4d975-40e7-4a38-8b86-b18e685c570b nodeName:}" failed. No retries permitted until 2025-09-30 06:35:16.396354546 +0000 UTC m=+959.871375616 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift") pod "swift-storage-0" (UID: "ccb4d975-40e7-4a38-8b86-b18e685c570b") : configmap "swift-ring-files" not found Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.455452 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-hskhk" Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.497840 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khrvh\" (UniqueName: \"kubernetes.io/projected/988ba2e2-8687-4a2d-91b4-f158c4725b65-kube-api-access-khrvh\") pod \"988ba2e2-8687-4a2d-91b4-f158c4725b65\" (UID: \"988ba2e2-8687-4a2d-91b4-f158c4725b65\") " Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.503903 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-hskhk" event={"ID":"988ba2e2-8687-4a2d-91b4-f158c4725b65","Type":"ContainerDied","Data":"9b54fb49afca53e73f6f6dd275132eccf76f000767c83cfcd136915507c4b83e"} Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.503938 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b54fb49afca53e73f6f6dd275132eccf76f000767c83cfcd136915507c4b83e" Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.503984 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-hskhk" Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.506139 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/988ba2e2-8687-4a2d-91b4-f158c4725b65-kube-api-access-khrvh" (OuterVolumeSpecName: "kube-api-access-khrvh") pod "988ba2e2-8687-4a2d-91b4-f158c4725b65" (UID: "988ba2e2-8687-4a2d-91b4-f158c4725b65"). InnerVolumeSpecName "kube-api-access-khrvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.603054 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khrvh\" (UniqueName: \"kubernetes.io/projected/988ba2e2-8687-4a2d-91b4-f158c4725b65-kube-api-access-khrvh\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.789349 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-jzcb7"] Sep 30 06:35:08 crc kubenswrapper[4691]: W0930 06:35:08.796269 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc4d262b4_6cf7_4fcc_bc2c_01fee2dbf62e.slice/crio-af40f3e0d604123a2da319e996b9282923bb45cc9a9e2fb6a963c456bf95a891 WatchSource:0}: Error finding container af40f3e0d604123a2da319e996b9282923bb45cc9a9e2fb6a963c456bf95a891: Status 404 returned error can't find the container with id af40f3e0d604123a2da319e996b9282923bb45cc9a9e2fb6a963c456bf95a891 Sep 30 06:35:08 crc kubenswrapper[4691]: I0930 06:35:08.878576 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-cfvmh"] Sep 30 06:35:08 crc kubenswrapper[4691]: W0930 06:35:08.885213 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf822b186_ff4b_4190_b86f_e20bcc5ae236.slice/crio-2c1dee28673910978a74f6d172797e58ffcf68ea6ea2c94487b13f28a28fc277 WatchSource:0}: Error finding container 2c1dee28673910978a74f6d172797e58ffcf68ea6ea2c94487b13f28a28fc277: Status 404 returned error can't find the container with id 2c1dee28673910978a74f6d172797e58ffcf68ea6ea2c94487b13f28a28fc277 Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.442457 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-a15e-account-create-nt67d"] Sep 30 06:35:09 crc kubenswrapper[4691]: E0930 06:35:09.443000 4691 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="988ba2e2-8687-4a2d-91b4-f158c4725b65" containerName="mariadb-database-create" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.443029 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="988ba2e2-8687-4a2d-91b4-f158c4725b65" containerName="mariadb-database-create" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.443387 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="988ba2e2-8687-4a2d-91b4-f158c4725b65" containerName="mariadb-database-create" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.444276 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-a15e-account-create-nt67d" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.451167 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-db-secret" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.456173 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-a15e-account-create-nt67d"] Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.515731 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d61470fc-16c1-40fb-bc8a-17517013b3be","Type":"ContainerStarted","Data":"36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a"} Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.517328 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f4xvm" event={"ID":"22775d02-1312-4d7a-917d-80dc62539dba","Type":"ContainerStarted","Data":"673641b5a644590eeb1e4e2bf3ee4b0ff6ec4221cd88979d3c445d47e1339418"} Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.519364 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pksx7\" (UniqueName: \"kubernetes.io/projected/cf1e1d74-2b4c-41a7-92fb-95337bebfd86-kube-api-access-pksx7\") pod \"watcher-a15e-account-create-nt67d\" (UID: \"cf1e1d74-2b4c-41a7-92fb-95337bebfd86\") " pod="openstack/watcher-a15e-account-create-nt67d" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.521729 4691 generic.go:334] "Generic (PLEG): container finished" podID="c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e" containerID="47f28f4f9a1d60ef2129ffbc784bffc15130cea11546ff2647590cce6a56a3d6" exitCode=0 Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.521815 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jzcb7" event={"ID":"c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e","Type":"ContainerDied","Data":"47f28f4f9a1d60ef2129ffbc784bffc15130cea11546ff2647590cce6a56a3d6"} Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.521844 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jzcb7" event={"ID":"c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e","Type":"ContainerStarted","Data":"af40f3e0d604123a2da319e996b9282923bb45cc9a9e2fb6a963c456bf95a891"} Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.524002 4691 generic.go:334] "Generic (PLEG): container finished" podID="f822b186-ff4b-4190-b86f-e20bcc5ae236" containerID="7e9471ae1a26bebf564997628cdb5c4645d5b0994bb9cfa27422177a4572f239" exitCode=0 Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.524039 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cfvmh" 
event={"ID":"f822b186-ff4b-4190-b86f-e20bcc5ae236","Type":"ContainerDied","Data":"7e9471ae1a26bebf564997628cdb5c4645d5b0994bb9cfa27422177a4572f239"} Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.524075 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cfvmh" event={"ID":"f822b186-ff4b-4190-b86f-e20bcc5ae236","Type":"ContainerStarted","Data":"2c1dee28673910978a74f6d172797e58ffcf68ea6ea2c94487b13f28a28fc277"} Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.561267 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-f4xvm" podStartSLOduration=3.0617797 podStartE2EDuration="5.561234296s" podCreationTimestamp="2025-09-30 06:35:04 +0000 UTC" firstStartedPulling="2025-09-30 06:35:05.852849722 +0000 UTC m=+949.327870772" lastFinishedPulling="2025-09-30 06:35:08.352304328 +0000 UTC m=+951.827325368" observedRunningTime="2025-09-30 06:35:09.541787054 +0000 UTC m=+953.016808114" watchObservedRunningTime="2025-09-30 06:35:09.561234296 +0000 UTC m=+953.036255376" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.620638 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pksx7\" (UniqueName: \"kubernetes.io/projected/cf1e1d74-2b4c-41a7-92fb-95337bebfd86-kube-api-access-pksx7\") pod \"watcher-a15e-account-create-nt67d\" (UID: \"cf1e1d74-2b4c-41a7-92fb-95337bebfd86\") " pod="openstack/watcher-a15e-account-create-nt67d" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.640032 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pksx7\" (UniqueName: \"kubernetes.io/projected/cf1e1d74-2b4c-41a7-92fb-95337bebfd86-kube-api-access-pksx7\") pod \"watcher-a15e-account-create-nt67d\" (UID: \"cf1e1d74-2b4c-41a7-92fb-95337bebfd86\") " pod="openstack/watcher-a15e-account-create-nt67d" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.802782 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-a15e-account-create-nt67d" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.816120 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg" Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.881713 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6749c445df-fjmdf"] Sep 30 06:35:09 crc kubenswrapper[4691]: I0930 06:35:09.882055 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" podUID="871e32c8-9326-4b62-8a26-de0e8d3bc670" containerName="dnsmasq-dns" containerID="cri-o://d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53" gracePeriod=10 Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.357149 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-a15e-account-create-nt67d"] Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.450931 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.547270 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-dns-svc\") pod \"871e32c8-9326-4b62-8a26-de0e8d3bc670\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.547524 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-config\") pod \"871e32c8-9326-4b62-8a26-de0e8d3bc670\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.547599 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4hgl5\" (UniqueName: \"kubernetes.io/projected/871e32c8-9326-4b62-8a26-de0e8d3bc670-kube-api-access-4hgl5\") pod \"871e32c8-9326-4b62-8a26-de0e8d3bc670\" (UID: \"871e32c8-9326-4b62-8a26-de0e8d3bc670\") " Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.552145 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a15e-account-create-nt67d" event={"ID":"cf1e1d74-2b4c-41a7-92fb-95337bebfd86","Type":"ContainerStarted","Data":"01191611f5df2cba60c056d88a4bc9ee5be9bc3f6f59e528d624b4997ad5a0a7"} Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.557378 4691 generic.go:334] "Generic (PLEG): container finished" podID="871e32c8-9326-4b62-8a26-de0e8d3bc670" containerID="d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53" exitCode=0 Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.557643 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.557735 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" event={"ID":"871e32c8-9326-4b62-8a26-de0e8d3bc670","Type":"ContainerDied","Data":"d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53"} Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.557787 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6749c445df-fjmdf" event={"ID":"871e32c8-9326-4b62-8a26-de0e8d3bc670","Type":"ContainerDied","Data":"41f05a397f9a6bcb7302593bacb27e7971598339b22d60e49628a979d2fa85a7"} Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.557809 4691 scope.go:117] "RemoveContainer" containerID="d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.564250 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/871e32c8-9326-4b62-8a26-de0e8d3bc670-kube-api-access-4hgl5" (OuterVolumeSpecName: "kube-api-access-4hgl5") pod "871e32c8-9326-4b62-8a26-de0e8d3bc670" (UID: "871e32c8-9326-4b62-8a26-de0e8d3bc670"). InnerVolumeSpecName "kube-api-access-4hgl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.590065 4691 scope.go:117] "RemoveContainer" containerID="46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.594566 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "871e32c8-9326-4b62-8a26-de0e8d3bc670" (UID: "871e32c8-9326-4b62-8a26-de0e8d3bc670"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.594991 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-config" (OuterVolumeSpecName: "config") pod "871e32c8-9326-4b62-8a26-de0e8d3bc670" (UID: "871e32c8-9326-4b62-8a26-de0e8d3bc670"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.615080 4691 scope.go:117] "RemoveContainer" containerID="d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53" Sep 30 06:35:10 crc kubenswrapper[4691]: E0930 06:35:10.615352 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53\": container with ID starting with d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53 not found: ID does not exist" containerID="d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.615376 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53"} err="failed to get container status \"d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53\": rpc error: code = NotFound desc = could not find container \"d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53\": container with ID starting with d17ba032a6d235b38b24ab9fdd8b54353eddf8cfcfee745cd83dbcfa3ba34b53 not found: ID does not exist" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.615395 4691 scope.go:117] "RemoveContainer" containerID="46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a" Sep 30 06:35:10 crc kubenswrapper[4691]: E0930 06:35:10.615786 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a\": container with ID starting with 46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a not found: ID does not exist" containerID="46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.615823 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a"} err="failed to get container status \"46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a\": rpc error: code = NotFound desc = could not find container \"46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a\": container with ID starting with 46b06f5edfd77c83b92a54db9db0d111ad47b4d0f96cb664bd683cb91041801a not found: ID does not exist" Sep 30 
06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.653145 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.653185 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/871e32c8-9326-4b62-8a26-de0e8d3bc670-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.653197 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4hgl5\" (UniqueName: \"kubernetes.io/projected/871e32c8-9326-4b62-8a26-de0e8d3bc670-kube-api-access-4hgl5\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.904335 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6749c445df-fjmdf"] Sep 30 06:35:10 crc kubenswrapper[4691]: I0930 06:35:10.910571 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6749c445df-fjmdf"] Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.001604 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-jzcb7" Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.059013 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwstd\" (UniqueName: \"kubernetes.io/projected/c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e-kube-api-access-dwstd\") pod \"c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e\" (UID: \"c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e\") " Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.063326 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e-kube-api-access-dwstd" (OuterVolumeSpecName: "kube-api-access-dwstd") pod "c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e" (UID: "c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e"). InnerVolumeSpecName "kube-api-access-dwstd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.070248 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cfvmh" Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.160757 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdj7c\" (UniqueName: \"kubernetes.io/projected/f822b186-ff4b-4190-b86f-e20bcc5ae236-kube-api-access-wdj7c\") pod \"f822b186-ff4b-4190-b86f-e20bcc5ae236\" (UID: \"f822b186-ff4b-4190-b86f-e20bcc5ae236\") " Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.161395 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwstd\" (UniqueName: \"kubernetes.io/projected/c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e-kube-api-access-dwstd\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.165009 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f822b186-ff4b-4190-b86f-e20bcc5ae236-kube-api-access-wdj7c" (OuterVolumeSpecName: "kube-api-access-wdj7c") pod "f822b186-ff4b-4190-b86f-e20bcc5ae236" (UID: "f822b186-ff4b-4190-b86f-e20bcc5ae236"). InnerVolumeSpecName "kube-api-access-wdj7c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.234480 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="871e32c8-9326-4b62-8a26-de0e8d3bc670" path="/var/lib/kubelet/pods/871e32c8-9326-4b62-8a26-de0e8d3bc670/volumes" Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.262597 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdj7c\" (UniqueName: \"kubernetes.io/projected/f822b186-ff4b-4190-b86f-e20bcc5ae236-kube-api-access-wdj7c\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.567935 4691 generic.go:334] "Generic (PLEG): container finished" podID="cf1e1d74-2b4c-41a7-92fb-95337bebfd86" containerID="ba1e3bb947de74b652862715d50eb68d4a0699ecb0378f425923c6a725a18008" exitCode=0 Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.568163 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a15e-account-create-nt67d" event={"ID":"cf1e1d74-2b4c-41a7-92fb-95337bebfd86","Type":"ContainerDied","Data":"ba1e3bb947de74b652862715d50eb68d4a0699ecb0378f425923c6a725a18008"} Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.574318 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-cfvmh" event={"ID":"f822b186-ff4b-4190-b86f-e20bcc5ae236","Type":"ContainerDied","Data":"2c1dee28673910978a74f6d172797e58ffcf68ea6ea2c94487b13f28a28fc277"} Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.574360 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2c1dee28673910978a74f6d172797e58ffcf68ea6ea2c94487b13f28a28fc277" Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.574419 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-cfvmh" Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.579584 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-jzcb7" event={"ID":"c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e","Type":"ContainerDied","Data":"af40f3e0d604123a2da319e996b9282923bb45cc9a9e2fb6a963c456bf95a891"} Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.579610 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af40f3e0d604123a2da319e996b9282923bb45cc9a9e2fb6a963c456bf95a891" Sep 30 06:35:11 crc kubenswrapper[4691]: I0930 06:35:11.579648 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-jzcb7" Sep 30 06:35:12 crc kubenswrapper[4691]: I0930 06:35:12.610787 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d61470fc-16c1-40fb-bc8a-17517013b3be","Type":"ContainerStarted","Data":"c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0"} Sep 30 06:35:12 crc kubenswrapper[4691]: I0930 06:35:12.646938 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=8.201608959 podStartE2EDuration="43.646916503s" podCreationTimestamp="2025-09-30 06:34:29 +0000 UTC" firstStartedPulling="2025-09-30 06:34:36.77773044 +0000 UTC m=+920.252751480" lastFinishedPulling="2025-09-30 06:35:12.223037974 +0000 UTC m=+955.698059024" observedRunningTime="2025-09-30 06:35:12.646174329 +0000 UTC m=+956.121195399" watchObservedRunningTime="2025-09-30 06:35:12.646916503 +0000 UTC m=+956.121937553" Sep 30 06:35:13 crc kubenswrapper[4691]: I0930 06:35:13.051271 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-a15e-account-create-nt67d" Sep 30 06:35:13 crc kubenswrapper[4691]: I0930 06:35:13.096677 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pksx7\" (UniqueName: \"kubernetes.io/projected/cf1e1d74-2b4c-41a7-92fb-95337bebfd86-kube-api-access-pksx7\") pod \"cf1e1d74-2b4c-41a7-92fb-95337bebfd86\" (UID: \"cf1e1d74-2b4c-41a7-92fb-95337bebfd86\") " Sep 30 06:35:13 crc kubenswrapper[4691]: I0930 06:35:13.102078 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf1e1d74-2b4c-41a7-92fb-95337bebfd86-kube-api-access-pksx7" (OuterVolumeSpecName: "kube-api-access-pksx7") pod "cf1e1d74-2b4c-41a7-92fb-95337bebfd86" (UID: "cf1e1d74-2b4c-41a7-92fb-95337bebfd86"). InnerVolumeSpecName "kube-api-access-pksx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:13 crc kubenswrapper[4691]: I0930 06:35:13.199463 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pksx7\" (UniqueName: \"kubernetes.io/projected/cf1e1d74-2b4c-41a7-92fb-95337bebfd86-kube-api-access-pksx7\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:13 crc kubenswrapper[4691]: I0930 06:35:13.501268 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Sep 30 06:35:13 crc kubenswrapper[4691]: I0930 06:35:13.626033 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-a15e-account-create-nt67d" Sep 30 06:35:13 crc kubenswrapper[4691]: I0930 06:35:13.626826 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-a15e-account-create-nt67d" event={"ID":"cf1e1d74-2b4c-41a7-92fb-95337bebfd86","Type":"ContainerDied","Data":"01191611f5df2cba60c056d88a4bc9ee5be9bc3f6f59e528d624b4997ad5a0a7"} Sep 30 06:35:13 crc kubenswrapper[4691]: I0930 06:35:13.626851 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01191611f5df2cba60c056d88a4bc9ee5be9bc3f6f59e528d624b4997ad5a0a7" Sep 30 06:35:15 crc kubenswrapper[4691]: I0930 06:35:15.885698 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:15 crc kubenswrapper[4691]: I0930 06:35:15.886093 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:15 crc kubenswrapper[4691]: I0930 06:35:15.890087 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.459849 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0" Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.468080 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/ccb4d975-40e7-4a38-8b86-b18e685c570b-etc-swift\") pod \"swift-storage-0\" (UID: \"ccb4d975-40e7-4a38-8b86-b18e685c570b\") " pod="openstack/swift-storage-0" Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.664496 4691 generic.go:334] "Generic (PLEG): container finished" podID="6b6ff7c5-6146-432e-a89c-fe95ac728e5c" containerID="b57a0525ddc4af294fd81dcc684a4dec26e9b838458599245b9086835932afd7" exitCode=0 Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.664635 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b6ff7c5-6146-432e-a89c-fe95ac728e5c","Type":"ContainerDied","Data":"b57a0525ddc4af294fd81dcc684a4dec26e9b838458599245b9086835932afd7"} Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.668673 4691 generic.go:334] "Generic (PLEG): container finished" podID="fd5df9d9-7a0a-441c-b21d-92dff2af7376" containerID="23bbfb3a62c8abdf4f6d70e59cf97b8f36b8de2806338d340cd77a5da638b089" exitCode=0 Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.668837 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fd5df9d9-7a0a-441c-b21d-92dff2af7376","Type":"ContainerDied","Data":"23bbfb3a62c8abdf4f6d70e59cf97b8f36b8de2806338d340cd77a5da638b089"} Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.674335 4691 generic.go:334] "Generic (PLEG): container finished" podID="22775d02-1312-4d7a-917d-80dc62539dba" containerID="673641b5a644590eeb1e4e2bf3ee4b0ff6ec4221cd88979d3c445d47e1339418" exitCode=0 Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.674433 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f4xvm" event={"ID":"22775d02-1312-4d7a-917d-80dc62539dba","Type":"ContainerDied","Data":"673641b5a644590eeb1e4e2bf3ee4b0ff6ec4221cd88979d3c445d47e1339418"} Sep 30 06:35:16 crc 
Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.676961 4691 generic.go:334] "Generic (PLEG): container finished" podID="d454968e-74c7-45e3-9608-e915973c7f25" containerID="dedcda10d9f4175b32470999d84df603f681ecdb704564dc1ee4851215f61575" exitCode=0
Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.678995 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d454968e-74c7-45e3-9608-e915973c7f25","Type":"ContainerDied","Data":"dedcda10d9f4175b32470999d84df603f681ecdb704564dc1ee4851215f61575"}
Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.680955 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:16 crc kubenswrapper[4691]: I0930 06:35:16.744948 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.432535 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.453559 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-a0bc-account-create-89btn"]
Sep 30 06:35:17 crc kubenswrapper[4691]: E0930 06:35:17.461352 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871e32c8-9326-4b62-8a26-de0e8d3bc670" containerName="init"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.461378 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="871e32c8-9326-4b62-8a26-de0e8d3bc670" containerName="init"
Sep 30 06:35:17 crc kubenswrapper[4691]: E0930 06:35:17.461389 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f822b186-ff4b-4190-b86f-e20bcc5ae236" containerName="mariadb-database-create"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.461410 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f822b186-ff4b-4190-b86f-e20bcc5ae236" containerName="mariadb-database-create"
Sep 30 06:35:17 crc kubenswrapper[4691]: E0930 06:35:17.461424 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e" containerName="mariadb-database-create"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.461430 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e" containerName="mariadb-database-create"
Sep 30 06:35:17 crc kubenswrapper[4691]: E0930 06:35:17.461443 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="871e32c8-9326-4b62-8a26-de0e8d3bc670" containerName="dnsmasq-dns"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.461449 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="871e32c8-9326-4b62-8a26-de0e8d3bc670" containerName="dnsmasq-dns"
Sep 30 06:35:17 crc kubenswrapper[4691]: E0930 06:35:17.461460 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf1e1d74-2b4c-41a7-92fb-95337bebfd86" containerName="mariadb-account-create"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.461465 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf1e1d74-2b4c-41a7-92fb-95337bebfd86" containerName="mariadb-account-create"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.462436 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e" containerName="mariadb-database-create"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.462469 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf1e1d74-2b4c-41a7-92fb-95337bebfd86" containerName="mariadb-account-create"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.462490 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f822b186-ff4b-4190-b86f-e20bcc5ae236" containerName="mariadb-database-create"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.462498 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="871e32c8-9326-4b62-8a26-de0e8d3bc670" containerName="dnsmasq-dns"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.462990 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a0bc-account-create-89btn"]
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.463064 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a0bc-account-create-89btn"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.465128 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.585734 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mmgq\" (UniqueName: \"kubernetes.io/projected/b5e86cb2-d4ee-4828-887f-4166440d359b-kube-api-access-5mmgq\") pod \"keystone-a0bc-account-create-89btn\" (UID: \"b5e86cb2-d4ee-4828-887f-4166440d359b\") " pod="openstack/keystone-a0bc-account-create-89btn"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.686851 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mmgq\" (UniqueName: \"kubernetes.io/projected/b5e86cb2-d4ee-4828-887f-4166440d359b-kube-api-access-5mmgq\") pod \"keystone-a0bc-account-create-89btn\" (UID: \"b5e86cb2-d4ee-4828-887f-4166440d359b\") " pod="openstack/keystone-a0bc-account-create-89btn"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.691459 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-notifications-server-0" event={"ID":"d454968e-74c7-45e3-9608-e915973c7f25","Type":"ContainerStarted","Data":"bdde1d62b3e17afece1c6a11a966bfc5b8ca1a42639deeed8c03e61cb778f24c"}
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.691731 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-notifications-server-0"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.693422 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"6c7b03dd4f39c9ba6fa6957e7e817f977e1a2e3e99222b49419817d9879ed0e6"}
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.695879 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b6ff7c5-6146-432e-a89c-fe95ac728e5c","Type":"ContainerStarted","Data":"3a338c90117dd264286426ab2d4ddfa45266ae38adc725c1467b4129a7c3187a"}
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.696121 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.697912 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fd5df9d9-7a0a-441c-b21d-92dff2af7376","Type":"ContainerStarted","Data":"25e026665d18a5a549dc7bd0f85f93fb72649cd66d04b7d5ccd6ddae22cf463e"}
Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.698145 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
(probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.719974 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-notifications-server-0" podStartSLOduration=48.863989705 podStartE2EDuration="55.719958726s" podCreationTimestamp="2025-09-30 06:34:22 +0000 UTC" firstStartedPulling="2025-09-30 06:34:36.672446863 +0000 UTC m=+920.147467923" lastFinishedPulling="2025-09-30 06:34:43.528415904 +0000 UTC m=+927.003436944" observedRunningTime="2025-09-30 06:35:17.710612557 +0000 UTC m=+961.185633597" watchObservedRunningTime="2025-09-30 06:35:17.719958726 +0000 UTC m=+961.194979766" Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.731310 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mmgq\" (UniqueName: \"kubernetes.io/projected/b5e86cb2-d4ee-4828-887f-4166440d359b-kube-api-access-5mmgq\") pod \"keystone-a0bc-account-create-89btn\" (UID: \"b5e86cb2-d4ee-4828-887f-4166440d359b\") " pod="openstack/keystone-a0bc-account-create-89btn" Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.757035 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=48.443714924 podStartE2EDuration="55.757018522s" podCreationTimestamp="2025-09-30 06:34:22 +0000 UTC" firstStartedPulling="2025-09-30 06:34:36.777142072 +0000 UTC m=+920.252163112" lastFinishedPulling="2025-09-30 06:34:44.09044566 +0000 UTC m=+927.565466710" observedRunningTime="2025-09-30 06:35:17.750590026 +0000 UTC m=+961.225611076" watchObservedRunningTime="2025-09-30 06:35:17.757018522 +0000 UTC m=+961.232039562" Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.769814 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-3c60-account-create-l5zxb"] Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.770808 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c60-account-create-l5zxb" Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.772173 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.778679 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.441013636 podStartE2EDuration="55.778657704s" podCreationTimestamp="2025-09-30 06:34:22 +0000 UTC" firstStartedPulling="2025-09-30 06:34:36.777475903 +0000 UTC m=+920.252496943" lastFinishedPulling="2025-09-30 06:34:44.115119921 +0000 UTC m=+927.590141011" observedRunningTime="2025-09-30 06:35:17.774609564 +0000 UTC m=+961.249630614" watchObservedRunningTime="2025-09-30 06:35:17.778657704 +0000 UTC m=+961.253678744" Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.787706 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3c60-account-create-l5zxb"] Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.813564 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a0bc-account-create-89btn" Sep 30 06:35:17 crc kubenswrapper[4691]: I0930 06:35:17.992303 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7gjv\" (UniqueName: \"kubernetes.io/projected/1a0abd23-3753-4910-9b91-539dde605d21-kube-api-access-h7gjv\") pod \"placement-3c60-account-create-l5zxb\" (UID: \"1a0abd23-3753-4910-9b91-539dde605d21\") " pod="openstack/placement-3c60-account-create-l5zxb" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.093907 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7gjv\" (UniqueName: \"kubernetes.io/projected/1a0abd23-3753-4910-9b91-539dde605d21-kube-api-access-h7gjv\") pod \"placement-3c60-account-create-l5zxb\" (UID: \"1a0abd23-3753-4910-9b91-539dde605d21\") " pod="openstack/placement-3c60-account-create-l5zxb" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.119556 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7gjv\" (UniqueName: \"kubernetes.io/projected/1a0abd23-3753-4910-9b91-539dde605d21-kube-api-access-h7gjv\") pod \"placement-3c60-account-create-l5zxb\" (UID: \"1a0abd23-3753-4910-9b91-539dde605d21\") " pod="openstack/placement-3c60-account-create-l5zxb" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.123492 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c60-account-create-l5zxb" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.366685 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.367186 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-csq87" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.370727 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-2wmg8" podUID="86397f09-76d1-4c35-a96a-5b6bde1e3574" containerName="ovn-controller" probeResult="failure" output=< Sep 30 06:35:18 crc kubenswrapper[4691]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Sep 30 06:35:18 crc kubenswrapper[4691]: > Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.401745 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.503726 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6btr\" (UniqueName: \"kubernetes.io/projected/22775d02-1312-4d7a-917d-80dc62539dba-kube-api-access-k6btr\") pod \"22775d02-1312-4d7a-917d-80dc62539dba\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.504158 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-combined-ca-bundle\") pod \"22775d02-1312-4d7a-917d-80dc62539dba\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.504211 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-ring-data-devices\") pod \"22775d02-1312-4d7a-917d-80dc62539dba\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.504272 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-dispersionconf\") pod \"22775d02-1312-4d7a-917d-80dc62539dba\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.504316 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-swiftconf\") pod \"22775d02-1312-4d7a-917d-80dc62539dba\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.504343 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22775d02-1312-4d7a-917d-80dc62539dba-etc-swift\") pod \"22775d02-1312-4d7a-917d-80dc62539dba\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.504364 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-scripts\") pod \"22775d02-1312-4d7a-917d-80dc62539dba\" (UID: \"22775d02-1312-4d7a-917d-80dc62539dba\") " Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.505551 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22775d02-1312-4d7a-917d-80dc62539dba-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "22775d02-1312-4d7a-917d-80dc62539dba" (UID: "22775d02-1312-4d7a-917d-80dc62539dba"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.508259 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "22775d02-1312-4d7a-917d-80dc62539dba" (UID: "22775d02-1312-4d7a-917d-80dc62539dba"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.534149 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-scripts" (OuterVolumeSpecName: "scripts") pod "22775d02-1312-4d7a-917d-80dc62539dba" (UID: "22775d02-1312-4d7a-917d-80dc62539dba"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.540240 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22775d02-1312-4d7a-917d-80dc62539dba-kube-api-access-k6btr" (OuterVolumeSpecName: "kube-api-access-k6btr") pod "22775d02-1312-4d7a-917d-80dc62539dba" (UID: "22775d02-1312-4d7a-917d-80dc62539dba"). InnerVolumeSpecName "kube-api-access-k6btr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.540323 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "22775d02-1312-4d7a-917d-80dc62539dba" (UID: "22775d02-1312-4d7a-917d-80dc62539dba"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.541485 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22775d02-1312-4d7a-917d-80dc62539dba" (UID: "22775d02-1312-4d7a-917d-80dc62539dba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.541794 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "22775d02-1312-4d7a-917d-80dc62539dba" (UID: "22775d02-1312-4d7a-917d-80dc62539dba"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.607609 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6btr\" (UniqueName: \"kubernetes.io/projected/22775d02-1312-4d7a-917d-80dc62539dba-kube-api-access-k6btr\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.607630 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.607639 4691 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-ring-data-devices\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.607649 4691 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-dispersionconf\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.607657 4691 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/22775d02-1312-4d7a-917d-80dc62539dba-swiftconf\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.607665 4691 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/22775d02-1312-4d7a-917d-80dc62539dba-etc-swift\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.607674 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/22775d02-1312-4d7a-917d-80dc62539dba-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.627716 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-2wmg8-config-pjtkf"] Sep 30 06:35:18 crc kubenswrapper[4691]: E0930 06:35:18.628096 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22775d02-1312-4d7a-917d-80dc62539dba" containerName="swift-ring-rebalance" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.628110 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="22775d02-1312-4d7a-917d-80dc62539dba" containerName="swift-ring-rebalance" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.628284 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="22775d02-1312-4d7a-917d-80dc62539dba" containerName="swift-ring-rebalance" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.628859 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.634220 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.634369 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2wmg8-config-pjtkf"] Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.708622 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f4xvm" event={"ID":"22775d02-1312-4d7a-917d-80dc62539dba","Type":"ContainerDied","Data":"30403dfceebf9000828ca6a40a1a0617ed1e84c6cf51a938f0998d455f23f316"} Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.709128 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30403dfceebf9000828ca6a40a1a0617ed1e84c6cf51a938f0998d455f23f316" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.708720 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f4xvm" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.810687 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zx87\" (UniqueName: \"kubernetes.io/projected/7d3185a5-d016-44b4-b090-2589b9e73ea2-kube-api-access-5zx87\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.810768 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-additional-scripts\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.810825 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-scripts\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.810933 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.811014 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run-ovn\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.811186 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-log-ovn\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: 
\"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.859096 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-3c60-account-create-l5zxb"] Sep 30 06:35:18 crc kubenswrapper[4691]: W0930 06:35:18.861206 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a0abd23_3753_4910_9b91_539dde605d21.slice/crio-4be71fb1677b0ae9b390105f98ae84e904ff7037243b1f379b32c077e69431fe WatchSource:0}: Error finding container 4be71fb1677b0ae9b390105f98ae84e904ff7037243b1f379b32c077e69431fe: Status 404 returned error can't find the container with id 4be71fb1677b0ae9b390105f98ae84e904ff7037243b1f379b32c077e69431fe Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.915605 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-scripts\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.915665 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.915690 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run-ovn\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.915732 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-log-ovn\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.915793 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zx87\" (UniqueName: \"kubernetes.io/projected/7d3185a5-d016-44b4-b090-2589b9e73ea2-kube-api-access-5zx87\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.915816 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-additional-scripts\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.916429 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-additional-scripts\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " 
pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.918527 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.918610 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run-ovn\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.918647 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-log-ovn\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.935193 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-scripts\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.945779 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zx87\" (UniqueName: \"kubernetes.io/projected/7d3185a5-d016-44b4-b090-2589b9e73ea2-kube-api-access-5zx87\") pod \"ovn-controller-2wmg8-config-pjtkf\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:18 crc kubenswrapper[4691]: I0930 06:35:18.988674 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.031503 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-a0bc-account-create-89btn"] Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.258993 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-2wmg8-config-pjtkf"] Sep 30 06:35:19 crc kubenswrapper[4691]: W0930 06:35:19.269801 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d3185a5_d016_44b4_b090_2589b9e73ea2.slice/crio-c1bc6c6668396132cea2e8f44b8386eca5fb1b5135ac22d28e05260fffb0e5af WatchSource:0}: Error finding container c1bc6c6668396132cea2e8f44b8386eca5fb1b5135ac22d28e05260fffb0e5af: Status 404 returned error can't find the container with id c1bc6c6668396132cea2e8f44b8386eca5fb1b5135ac22d28e05260fffb0e5af Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.369534 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.369759 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="prometheus" containerID="cri-o://d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f" gracePeriod=600 Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.370104 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="thanos-sidecar" containerID="cri-o://c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0" gracePeriod=600 Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.370151 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="config-reloader" containerID="cri-o://36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a" gracePeriod=600 Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.723030 4691 generic.go:334] "Generic (PLEG): container finished" podID="b5e86cb2-d4ee-4828-887f-4166440d359b" containerID="acd8eda80e88fd8232ebc4d373f7aa89e33d5783392512a53a5cf79bf487c07c" exitCode=0 Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.723413 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a0bc-account-create-89btn" event={"ID":"b5e86cb2-d4ee-4828-887f-4166440d359b","Type":"ContainerDied","Data":"acd8eda80e88fd8232ebc4d373f7aa89e33d5783392512a53a5cf79bf487c07c"} Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.723459 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a0bc-account-create-89btn" event={"ID":"b5e86cb2-d4ee-4828-887f-4166440d359b","Type":"ContainerStarted","Data":"085ed8e1811560a56504ac0042aa9a02f17a07328f3390f85ea4da3d1349fcc6"} Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.727619 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"aba6b92e59ce9be39abc666dd94841c8c15220850be2b8b8ae2e4d25ebb7b8f5"} Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.727654 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"4a63ba4bb03ff3092b02889c6e203550f9493325455385a89fa56c943891ad32"} Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.727666 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"f64e98826fbddd0d996ac98f957f6c954f534ebdf1cff9183c6926f8e3de08c5"} Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.727675 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"dbedbca556702155af5a3e3880384791cb3349224a4165a31ecc88a893a1eff1"} Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.733474 4691 generic.go:334] "Generic (PLEG): container finished" podID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerID="c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0" exitCode=0 Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.733506 4691 generic.go:334] "Generic (PLEG): container finished" podID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerID="d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f" exitCode=0 Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.733541 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d61470fc-16c1-40fb-bc8a-17517013b3be","Type":"ContainerDied","Data":"c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0"} Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.733573 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d61470fc-16c1-40fb-bc8a-17517013b3be","Type":"ContainerDied","Data":"d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f"} Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.735923 4691 generic.go:334] "Generic (PLEG): container finished" podID="1a0abd23-3753-4910-9b91-539dde605d21" containerID="6f46d0d929e186ca33eb3a982e34e83930476b7afccb251fe18c624e6b12dd8d" exitCode=0 Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.735986 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c60-account-create-l5zxb" event={"ID":"1a0abd23-3753-4910-9b91-539dde605d21","Type":"ContainerDied","Data":"6f46d0d929e186ca33eb3a982e34e83930476b7afccb251fe18c624e6b12dd8d"} Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.736011 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c60-account-create-l5zxb" event={"ID":"1a0abd23-3753-4910-9b91-539dde605d21","Type":"ContainerStarted","Data":"4be71fb1677b0ae9b390105f98ae84e904ff7037243b1f379b32c077e69431fe"} Sep 30 06:35:19 crc kubenswrapper[4691]: I0930 06:35:19.743743 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2wmg8-config-pjtkf" event={"ID":"7d3185a5-d016-44b4-b090-2589b9e73ea2","Type":"ContainerStarted","Data":"c1bc6c6668396132cea2e8f44b8386eca5fb1b5135ac22d28e05260fffb0e5af"} Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.564221 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.760944 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-web-config\") pod \"d61470fc-16c1-40fb-bc8a-17517013b3be\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.760979 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggnpl\" (UniqueName: \"kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-kube-api-access-ggnpl\") pod \"d61470fc-16c1-40fb-bc8a-17517013b3be\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.761061 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-config\") pod \"d61470fc-16c1-40fb-bc8a-17517013b3be\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.761095 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d61470fc-16c1-40fb-bc8a-17517013b3be-config-out\") pod \"d61470fc-16c1-40fb-bc8a-17517013b3be\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.761233 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"d61470fc-16c1-40fb-bc8a-17517013b3be\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.761310 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-thanos-prometheus-http-client-file\") pod \"d61470fc-16c1-40fb-bc8a-17517013b3be\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.761335 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d61470fc-16c1-40fb-bc8a-17517013b3be-prometheus-metric-storage-rulefiles-0\") pod \"d61470fc-16c1-40fb-bc8a-17517013b3be\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.761393 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-tls-assets\") pod \"d61470fc-16c1-40fb-bc8a-17517013b3be\" (UID: \"d61470fc-16c1-40fb-bc8a-17517013b3be\") " Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.762753 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d61470fc-16c1-40fb-bc8a-17517013b3be-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "d61470fc-16c1-40fb-bc8a-17517013b3be" (UID: "d61470fc-16c1-40fb-bc8a-17517013b3be"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.767151 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "d61470fc-16c1-40fb-bc8a-17517013b3be" (UID: "d61470fc-16c1-40fb-bc8a-17517013b3be"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.769624 4691 generic.go:334] "Generic (PLEG): container finished" podID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerID="36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a" exitCode=0 Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.769890 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d61470fc-16c1-40fb-bc8a-17517013b3be","Type":"ContainerDied","Data":"36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a"} Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.769932 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"d61470fc-16c1-40fb-bc8a-17517013b3be","Type":"ContainerDied","Data":"5cecc09118a6085ae692055c9030c58e0231d43d1b32afd1ea79888a196c7da5"} Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.769987 4691 scope.go:117] "RemoveContainer" containerID="c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.770884 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.784112 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-kube-api-access-ggnpl" (OuterVolumeSpecName: "kube-api-access-ggnpl") pod "d61470fc-16c1-40fb-bc8a-17517013b3be" (UID: "d61470fc-16c1-40fb-bc8a-17517013b3be"). InnerVolumeSpecName "kube-api-access-ggnpl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.786770 4691 generic.go:334] "Generic (PLEG): container finished" podID="7d3185a5-d016-44b4-b090-2589b9e73ea2" containerID="e863e1d193f7d52a37582c68c9ce346b46e2528677a41ef234001593a370589f" exitCode=0 Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.787493 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2wmg8-config-pjtkf" event={"ID":"7d3185a5-d016-44b4-b090-2589b9e73ea2","Type":"ContainerDied","Data":"e863e1d193f7d52a37582c68c9ce346b46e2528677a41ef234001593a370589f"} Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.789458 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-config" (OuterVolumeSpecName: "config") pod "d61470fc-16c1-40fb-bc8a-17517013b3be" (UID: "d61470fc-16c1-40fb-bc8a-17517013b3be"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.790115 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d61470fc-16c1-40fb-bc8a-17517013b3be-config-out" (OuterVolumeSpecName: "config-out") pod "d61470fc-16c1-40fb-bc8a-17517013b3be" (UID: "d61470fc-16c1-40fb-bc8a-17517013b3be"). InnerVolumeSpecName "config-out". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.792136 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "d61470fc-16c1-40fb-bc8a-17517013b3be" (UID: "d61470fc-16c1-40fb-bc8a-17517013b3be"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.797275 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-web-config" (OuterVolumeSpecName: "web-config") pod "d61470fc-16c1-40fb-bc8a-17517013b3be" (UID: "d61470fc-16c1-40fb-bc8a-17517013b3be"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.808342 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "d61470fc-16c1-40fb-bc8a-17517013b3be" (UID: "d61470fc-16c1-40fb-bc8a-17517013b3be"). InnerVolumeSpecName "pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5". PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.865584 4691 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.866706 4691 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/d61470fc-16c1-40fb-bc8a-17517013b3be-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.866744 4691 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.866755 4691 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.866765 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggnpl\" (UniqueName: \"kubernetes.io/projected/d61470fc-16c1-40fb-bc8a-17517013b3be-kube-api-access-ggnpl\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.866775 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/d61470fc-16c1-40fb-bc8a-17517013b3be-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.866787 4691 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d61470fc-16c1-40fb-bc8a-17517013b3be-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.866835 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume 
\"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") on node \"crc\" " Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.926137 4691 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.926281 4691 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5") on node "crc" Sep 30 06:35:20 crc kubenswrapper[4691]: I0930 06:35:20.968704 4691 reconciler_common.go:293] "Volume detached for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.058360 4691 scope.go:117] "RemoveContainer" containerID="36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.080308 4691 scope.go:117] "RemoveContainer" containerID="d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.107689 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.113111 4691 scope.go:117] "RemoveContainer" containerID="abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.161560 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.174439 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:35:21 crc kubenswrapper[4691]: E0930 06:35:21.174735 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="thanos-sidecar" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.174751 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="thanos-sidecar" Sep 30 06:35:21 crc kubenswrapper[4691]: E0930 06:35:21.174764 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="prometheus" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.174771 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="prometheus" Sep 30 06:35:21 crc kubenswrapper[4691]: E0930 06:35:21.174791 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="init-config-reloader" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.174797 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="init-config-reloader" Sep 30 06:35:21 crc kubenswrapper[4691]: E0930 06:35:21.174829 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="config-reloader" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.174834 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" 
containerName="config-reloader" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.175074 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="thanos-sidecar" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.175092 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="prometheus" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.175101 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" containerName="config-reloader" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.176606 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.181133 4691 scope.go:117] "RemoveContainer" containerID="c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0" Sep 30 06:35:21 crc kubenswrapper[4691]: E0930 06:35:21.181521 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0\": container with ID starting with c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0 not found: ID does not exist" containerID="c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.181558 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0"} err="failed to get container status \"c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0\": rpc error: code = NotFound desc = could not find container \"c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0\": container with ID starting with c03ae0e7e65cc371ff5f45ab64f69314193391be521bbea06ba84ea7dfc5adb0 not found: ID does not exist" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.181583 4691 scope.go:117] "RemoveContainer" containerID="36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.182026 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.182202 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-5hw8h" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.182429 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.182721 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 06:35:21 crc kubenswrapper[4691]: E0930 06:35:21.183490 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a\": container with ID starting with 36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a not found: ID does not exist" containerID="36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.183528 4691 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a"} err="failed to get container status \"36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a\": rpc error: code = NotFound desc = could not find container \"36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a\": container with ID starting with 36efbf42370719637b353083faaa9b5e0765d528e8646b1f27e9289c5c3eb77a not found: ID does not exist" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.183560 4691 scope.go:117] "RemoveContainer" containerID="d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.184602 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 06:35:21 crc kubenswrapper[4691]: E0930 06:35:21.187199 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f\": container with ID starting with d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f not found: ID does not exist" containerID="d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.187247 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f"} err="failed to get container status \"d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f\": rpc error: code = NotFound desc = could not find container \"d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f\": container with ID starting with d935e49d4bf9dd6496df247011a570c49579ab41b053b37ff6b73f549dcaba1f not found: ID does not exist" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.187278 4691 scope.go:117] "RemoveContainer" containerID="abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.187531 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 06:35:21 crc kubenswrapper[4691]: E0930 06:35:21.187676 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793\": container with ID starting with abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793 not found: ID does not exist" containerID="abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.187702 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793"} err="failed to get container status \"abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793\": rpc error: code = NotFound desc = could not find container \"abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793\": container with ID starting with abb28118e166575249eb35ac69a1322f28ea82829f07caec702e4fd6cda8d793 not found: ID does not exist" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.191361 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.191502 4691 
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.253932 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d61470fc-16c1-40fb-bc8a-17517013b3be" path="/var/lib/kubelet/pods/d61470fc-16c1-40fb-bc8a-17517013b3be/volumes"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378214 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378305 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e6d36519-9195-4e0b-9760-844d420e2661-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378342 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgfvq\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-kube-api-access-wgfvq\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378393 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378475 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378512 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378544 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378593 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
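[note] The UniqueName values above follow the kubelet's naming convention for CSI-backed volumes: the in-tree plugin prefix, then the driver name and the driver-scoped volume handle joined with "^". A tiny illustrative helper (not the kubelet's own function) that reproduces the shape seen in the log:

    package volutil

    import "fmt"

    // csiUniqueVolumeName builds names of the form seen above:
    //   kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-...
    // i.e. <plugin>/<driver>^<volumeHandle>. The "^" separator keeps the
    // driver name and the handle unambiguous in a single flat string.
    func csiUniqueVolumeName(driver, volumeHandle string) string {
        return fmt.Sprintf("kubernetes.io/csi/%s^%s", driver, volumeHandle)
    }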
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378674 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6d36519-9195-4e0b-9760-844d420e2661-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378700 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.378730 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.404782 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c60-account-create-l5zxb" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.410034 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-a0bc-account-create-89btn" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.480443 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6d36519-9195-4e0b-9760-844d420e2661-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.480493 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.480523 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.480555 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.480585 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e6d36519-9195-4e0b-9760-844d420e2661-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.480608 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wgfvq\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-kube-api-access-wgfvq\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.480645 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.480679 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.480703 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.480723 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.481565 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.484044 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e6d36519-9195-4e0b-9760-844d420e2661-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.485994 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6d36519-9195-4e0b-9760-844d420e2661-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.486220 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.486263 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-config\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.486619 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.486968 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.491110 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.491827 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.496216 4691 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.496305 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c288a5d41a5881b6adb6be722d4e7a99207424eb3b5d2db5e4a72cf60753eefa/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.499533 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgfvq\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-kube-api-access-wgfvq\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.528152 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " pod="openstack/prometheus-metric-storage-0"
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.582265 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mmgq\" (UniqueName: \"kubernetes.io/projected/b5e86cb2-d4ee-4828-887f-4166440d359b-kube-api-access-5mmgq\") pod \"b5e86cb2-d4ee-4828-887f-4166440d359b\" (UID: \"b5e86cb2-d4ee-4828-887f-4166440d359b\") "
Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.582379 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7gjv\" (UniqueName: \"kubernetes.io/projected/1a0abd23-3753-4910-9b91-539dde605d21-kube-api-access-h7gjv\") pod \"1a0abd23-3753-4910-9b91-539dde605d21\" (UID: \"1a0abd23-3753-4910-9b91-539dde605d21\") "
\"1a0abd23-3753-4910-9b91-539dde605d21\") " Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.585332 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a0abd23-3753-4910-9b91-539dde605d21-kube-api-access-h7gjv" (OuterVolumeSpecName: "kube-api-access-h7gjv") pod "1a0abd23-3753-4910-9b91-539dde605d21" (UID: "1a0abd23-3753-4910-9b91-539dde605d21"). InnerVolumeSpecName "kube-api-access-h7gjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.585509 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e86cb2-d4ee-4828-887f-4166440d359b-kube-api-access-5mmgq" (OuterVolumeSpecName: "kube-api-access-5mmgq") pod "b5e86cb2-d4ee-4828-887f-4166440d359b" (UID: "b5e86cb2-d4ee-4828-887f-4166440d359b"). InnerVolumeSpecName "kube-api-access-5mmgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.684811 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mmgq\" (UniqueName: \"kubernetes.io/projected/b5e86cb2-d4ee-4828-887f-4166440d359b-kube-api-access-5mmgq\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.684879 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h7gjv\" (UniqueName: \"kubernetes.io/projected/1a0abd23-3753-4910-9b91-539dde605d21-kube-api-access-h7gjv\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.799817 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-a0bc-account-create-89btn" event={"ID":"b5e86cb2-d4ee-4828-887f-4166440d359b","Type":"ContainerDied","Data":"085ed8e1811560a56504ac0042aa9a02f17a07328f3390f85ea4da3d1349fcc6"} Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.799871 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-a0bc-account-create-89btn" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.799889 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="085ed8e1811560a56504ac0042aa9a02f17a07328f3390f85ea4da3d1349fcc6" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.802993 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.804707 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-3c60-account-create-l5zxb" event={"ID":"1a0abd23-3753-4910-9b91-539dde605d21","Type":"ContainerDied","Data":"4be71fb1677b0ae9b390105f98ae84e904ff7037243b1f379b32c077e69431fe"} Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.804769 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4be71fb1677b0ae9b390105f98ae84e904ff7037243b1f379b32c077e69431fe" Sep 30 06:35:21 crc kubenswrapper[4691]: I0930 06:35:21.804801 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-3c60-account-create-l5zxb" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.223578 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.334511 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 06:35:22 crc kubenswrapper[4691]: W0930 06:35:22.336669 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6d36519_9195_4e0b_9760_844d420e2661.slice/crio-996dc983d58157a9561dd45d248b41b9d8d77e918cf36938e0e399b6d006c25a WatchSource:0}: Error finding container 996dc983d58157a9561dd45d248b41b9d8d77e918cf36938e0e399b6d006c25a: Status 404 returned error can't find the container with id 996dc983d58157a9561dd45d248b41b9d8d77e918cf36938e0e399b6d006c25a Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.394652 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run\") pod \"7d3185a5-d016-44b4-b090-2589b9e73ea2\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.394727 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-scripts\") pod \"7d3185a5-d016-44b4-b090-2589b9e73ea2\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.394775 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zx87\" (UniqueName: \"kubernetes.io/projected/7d3185a5-d016-44b4-b090-2589b9e73ea2-kube-api-access-5zx87\") pod \"7d3185a5-d016-44b4-b090-2589b9e73ea2\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.394774 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run" (OuterVolumeSpecName: "var-run") pod "7d3185a5-d016-44b4-b090-2589b9e73ea2" (UID: "7d3185a5-d016-44b4-b090-2589b9e73ea2"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.394822 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-additional-scripts\") pod \"7d3185a5-d016-44b4-b090-2589b9e73ea2\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.394915 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run-ovn\") pod \"7d3185a5-d016-44b4-b090-2589b9e73ea2\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.394969 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-log-ovn\") pod \"7d3185a5-d016-44b4-b090-2589b9e73ea2\" (UID: \"7d3185a5-d016-44b4-b090-2589b9e73ea2\") " Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.395341 4691 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.396101 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-scripts" (OuterVolumeSpecName: "scripts") pod "7d3185a5-d016-44b4-b090-2589b9e73ea2" (UID: "7d3185a5-d016-44b4-b090-2589b9e73ea2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.396404 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "7d3185a5-d016-44b4-b090-2589b9e73ea2" (UID: "7d3185a5-d016-44b4-b090-2589b9e73ea2"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.396655 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "7d3185a5-d016-44b4-b090-2589b9e73ea2" (UID: "7d3185a5-d016-44b4-b090-2589b9e73ea2"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.397366 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "7d3185a5-d016-44b4-b090-2589b9e73ea2" (UID: "7d3185a5-d016-44b4-b090-2589b9e73ea2"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.399132 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d3185a5-d016-44b4-b090-2589b9e73ea2-kube-api-access-5zx87" (OuterVolumeSpecName: "kube-api-access-5zx87") pod "7d3185a5-d016-44b4-b090-2589b9e73ea2" (UID: "7d3185a5-d016-44b4-b090-2589b9e73ea2"). InnerVolumeSpecName "kube-api-access-5zx87". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.497449 4691 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-additional-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.497481 4691 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-run-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.497490 4691 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/7d3185a5-d016-44b4-b090-2589b9e73ea2-var-log-ovn\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.497498 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7d3185a5-d016-44b4-b090-2589b9e73ea2-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.497521 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zx87\" (UniqueName: \"kubernetes.io/projected/7d3185a5-d016-44b4-b090-2589b9e73ea2-kube-api-access-5zx87\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.812514 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-2wmg8-config-pjtkf" event={"ID":"7d3185a5-d016-44b4-b090-2589b9e73ea2","Type":"ContainerDied","Data":"c1bc6c6668396132cea2e8f44b8386eca5fb1b5135ac22d28e05260fffb0e5af"} Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.812552 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1bc6c6668396132cea2e8f44b8386eca5fb1b5135ac22d28e05260fffb0e5af" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.812605 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-2wmg8-config-pjtkf" Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.813344 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6d36519-9195-4e0b-9760-844d420e2661","Type":"ContainerStarted","Data":"996dc983d58157a9561dd45d248b41b9d8d77e918cf36938e0e399b6d006c25a"} Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.818209 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"05d036df88480263a8fda7c17f3352c38fe75047358320da884f12e9743df9d5"} Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.818244 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"e23fcdfda6e94eb5d4905066bc82bd7798bf0bb4589ca9eaf20e49acd623c2e3"} Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.818254 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"46512116d92cb7c1542d03e98800ef1005dd5364c162b60d6ad8ac93eaeed352"} Sep 30 06:35:22 crc kubenswrapper[4691]: I0930 06:35:22.818262 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"b7eae30fcbfa02f7070fc80e103bb46208b02fd03c586a2b55154c83196f112f"} Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.046311 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-9400-account-create-2jdrq"] Sep 30 06:35:23 crc kubenswrapper[4691]: E0930 06:35:23.046640 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e86cb2-d4ee-4828-887f-4166440d359b" containerName="mariadb-account-create" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.046651 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e86cb2-d4ee-4828-887f-4166440d359b" containerName="mariadb-account-create" Sep 30 06:35:23 crc kubenswrapper[4691]: E0930 06:35:23.046669 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a0abd23-3753-4910-9b91-539dde605d21" containerName="mariadb-account-create" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.046675 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a0abd23-3753-4910-9b91-539dde605d21" containerName="mariadb-account-create" Sep 30 06:35:23 crc kubenswrapper[4691]: E0930 06:35:23.046694 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d3185a5-d016-44b4-b090-2589b9e73ea2" containerName="ovn-config" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.046700 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d3185a5-d016-44b4-b090-2589b9e73ea2" containerName="ovn-config" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.046878 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a0abd23-3753-4910-9b91-539dde605d21" containerName="mariadb-account-create" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.046899 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d3185a5-d016-44b4-b090-2589b9e73ea2" containerName="ovn-config" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.046914 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e86cb2-d4ee-4828-887f-4166440d359b" 
containerName="mariadb-account-create" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.047468 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9400-account-create-2jdrq" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.050479 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.062219 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9400-account-create-2jdrq"] Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.205077 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb9kv\" (UniqueName: \"kubernetes.io/projected/da5644f7-944d-4df5-ae95-535bbf9399a1-kube-api-access-bb9kv\") pod \"glance-9400-account-create-2jdrq\" (UID: \"da5644f7-944d-4df5-ae95-535bbf9399a1\") " pod="openstack/glance-9400-account-create-2jdrq" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.306594 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb9kv\" (UniqueName: \"kubernetes.io/projected/da5644f7-944d-4df5-ae95-535bbf9399a1-kube-api-access-bb9kv\") pod \"glance-9400-account-create-2jdrq\" (UID: \"da5644f7-944d-4df5-ae95-535bbf9399a1\") " pod="openstack/glance-9400-account-create-2jdrq" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.350630 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb9kv\" (UniqueName: \"kubernetes.io/projected/da5644f7-944d-4df5-ae95-535bbf9399a1-kube-api-access-bb9kv\") pod \"glance-9400-account-create-2jdrq\" (UID: \"da5644f7-944d-4df5-ae95-535bbf9399a1\") " pod="openstack/glance-9400-account-create-2jdrq" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.361216 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-2wmg8" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.361547 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-9400-account-create-2jdrq" Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.371036 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-2wmg8-config-pjtkf"] Sep 30 06:35:23 crc kubenswrapper[4691]: I0930 06:35:23.377285 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-2wmg8-config-pjtkf"] Sep 30 06:35:24 crc kubenswrapper[4691]: I0930 06:35:24.626015 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-9400-account-create-2jdrq"] Sep 30 06:35:25 crc kubenswrapper[4691]: I0930 06:35:25.238678 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d3185a5-d016-44b4-b090-2589b9e73ea2" path="/var/lib/kubelet/pods/7d3185a5-d016-44b4-b090-2589b9e73ea2/volumes" Sep 30 06:35:25 crc kubenswrapper[4691]: I0930 06:35:25.844653 4691 generic.go:334] "Generic (PLEG): container finished" podID="da5644f7-944d-4df5-ae95-535bbf9399a1" containerID="fdc084a5e240d7ff6e63e954c36d38a5acc82bda4ee9495763fb9ce7d6d84272" exitCode=0 Sep 30 06:35:25 crc kubenswrapper[4691]: I0930 06:35:25.844830 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9400-account-create-2jdrq" event={"ID":"da5644f7-944d-4df5-ae95-535bbf9399a1","Type":"ContainerDied","Data":"fdc084a5e240d7ff6e63e954c36d38a5acc82bda4ee9495763fb9ce7d6d84272"} Sep 30 06:35:25 crc kubenswrapper[4691]: I0930 06:35:25.845041 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9400-account-create-2jdrq" event={"ID":"da5644f7-944d-4df5-ae95-535bbf9399a1","Type":"ContainerStarted","Data":"f1b8ec8fb8e19f729c2080110f69ecf2f477815fbcdcec2317e15a1f401f0244"} Sep 30 06:35:25 crc kubenswrapper[4691]: I0930 06:35:25.851328 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"6820d7c8d6e00d8ebac62f43aa6dcb75c94dbdcb5db3561552b4c4fd0db2fa92"} Sep 30 06:35:25 crc kubenswrapper[4691]: I0930 06:35:25.851369 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"8f63a34b648be4140d2575145055f275b7fb1633d5e180449a5c01fd6fd125df"} Sep 30 06:35:25 crc kubenswrapper[4691]: I0930 06:35:25.851382 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"1ecde1788bd0a9f01e264235e497ed3d5146f19f66b9c90994ee5f472c40e1fa"} Sep 30 06:35:25 crc kubenswrapper[4691]: I0930 06:35:25.853153 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6d36519-9195-4e0b-9760-844d420e2661","Type":"ContainerStarted","Data":"e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6"} Sep 30 06:35:26 crc kubenswrapper[4691]: I0930 06:35:26.870032 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"4fb1ea667558e9a7d8cc34f48d166dbbf6106e259f9c273107af178d594968b9"} Sep 30 06:35:26 crc kubenswrapper[4691]: I0930 06:35:26.870344 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"e02108b8c860acc6f28d8b2aca338be5287a9a4ee0f3cf833c61aaf5703bf9d3"} Sep 30 
Sep 30 06:35:26 crc kubenswrapper[4691]: I0930 06:35:26.870362 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"d4f74256618964e512b838ee42cb4bd8470a828690d22f84c6fc81e1e9cf5052"}
Sep 30 06:35:26 crc kubenswrapper[4691]: I0930 06:35:26.870374 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"ccb4d975-40e7-4a38-8b86-b18e685c570b","Type":"ContainerStarted","Data":"94a15d6a23079b29fe2e2ed27624b392d9e9a3db9eebb95e0712a32febb2a0ac"}
Sep 30 06:35:26 crc kubenswrapper[4691]: I0930 06:35:26.917644 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=20.306316637 podStartE2EDuration="27.917626889s" podCreationTimestamp="2025-09-30 06:34:59 +0000 UTC" firstStartedPulling="2025-09-30 06:35:17.44682186 +0000 UTC m=+960.921842910" lastFinishedPulling="2025-09-30 06:35:25.058132122 +0000 UTC m=+968.533153162" observedRunningTime="2025-09-30 06:35:26.913958531 +0000 UTC m=+970.388979581" watchObservedRunningTime="2025-09-30 06:35:26.917626889 +0000 UTC m=+970.392647949"
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.203474 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-75fd5776c-42zjc"]
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.204839 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75fd5776c-42zjc"
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.206952 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.220694 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75fd5776c-42zjc"]
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.267007 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9400-account-create-2jdrq"
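[note] The pod_startup_latency_tracker line for swift-storage-0 encodes a small calculation worth unpacking: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (06:35:26.917626889 - 06:34:59 = 27.917626889s), and podStartSLOduration subtracts the image-pull window (lastFinishedPulling - firstStartedPulling, using the monotonic m=+... readings: 968.533153162 - 960.921842910 = 7.611310252s), giving 27.917626889 - 7.611310252 = 20.306316637s. The same arithmetic as a runnable check, with the log's own numbers:

    package main

    import "fmt"

    func main() {
        // Monotonic clock offsets (the m=+... values) from the log line.
        firstStartedPulling := 960.921842910
        lastFinishedPulling := 968.533153162
        podStartE2E := 27.917626889 // seconds, from podStartE2EDuration

        pullWindow := lastFinishedPulling - firstStartedPulling
        slo := podStartE2E - pullWindow
        fmt.Printf("pull window: %.9fs, podStartSLOduration: %.9fs\n", pullWindow, slo)
        // Prints ~7.611310252s and ~20.306316637s, matching the log.
    }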
Need to start a new one" pod="openstack/glance-9400-account-create-2jdrq" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.379650 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bb9kv\" (UniqueName: \"kubernetes.io/projected/da5644f7-944d-4df5-ae95-535bbf9399a1-kube-api-access-bb9kv\") pod \"da5644f7-944d-4df5-ae95-535bbf9399a1\" (UID: \"da5644f7-944d-4df5-ae95-535bbf9399a1\") " Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.379905 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-swift-storage-0\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.379983 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-config\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.380002 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-svc\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.380110 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-sb\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.380181 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-nb\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.380454 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4m8m\" (UniqueName: \"kubernetes.io/projected/5385150d-abdd-4b17-bbf4-fee7d4b5946e-kube-api-access-v4m8m\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.387048 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da5644f7-944d-4df5-ae95-535bbf9399a1-kube-api-access-bb9kv" (OuterVolumeSpecName: "kube-api-access-bb9kv") pod "da5644f7-944d-4df5-ae95-535bbf9399a1" (UID: "da5644f7-944d-4df5-ae95-535bbf9399a1"). InnerVolumeSpecName "kube-api-access-bb9kv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.482320 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-swift-storage-0\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.482456 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-config\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.482478 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-svc\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.482508 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-sb\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.482540 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-nb\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.482562 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4m8m\" (UniqueName: \"kubernetes.io/projected/5385150d-abdd-4b17-bbf4-fee7d4b5946e-kube-api-access-v4m8m\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.483578 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-svc\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.483592 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-swift-storage-0\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.483603 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-nb\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.483675 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bb9kv\" (UniqueName: \"kubernetes.io/projected/da5644f7-944d-4df5-ae95-535bbf9399a1-kube-api-access-bb9kv\") on node \"crc\" DevicePath \"\""
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.483708 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-config\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc"
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.503652 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4m8m\" (UniqueName: \"kubernetes.io/projected/5385150d-abdd-4b17-bbf4-fee7d4b5946e-kube-api-access-v4m8m\") pod \"dnsmasq-dns-75fd5776c-42zjc\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " pod="openstack/dnsmasq-dns-75fd5776c-42zjc"
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.594227 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75fd5776c-42zjc"
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.876124 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-9400-account-create-2jdrq"
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.876111 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-9400-account-create-2jdrq" event={"ID":"da5644f7-944d-4df5-ae95-535bbf9399a1","Type":"ContainerDied","Data":"f1b8ec8fb8e19f729c2080110f69ecf2f477815fbcdcec2317e15a1f401f0244"}
Sep 30 06:35:27 crc kubenswrapper[4691]: I0930 06:35:27.876413 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b8ec8fb8e19f729c2080110f69ecf2f477815fbcdcec2317e15a1f401f0244"
Sep 30 06:35:28 crc kubenswrapper[4691]: I0930 06:35:28.099831 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-75fd5776c-42zjc"]
Sep 30 06:35:28 crc kubenswrapper[4691]: I0930 06:35:28.893977 4691 generic.go:334] "Generic (PLEG): container finished" podID="5385150d-abdd-4b17-bbf4-fee7d4b5946e" containerID="40e732a792146e552b05fa44be23dda22f4f40a9eff19885c012d4acec28c16a" exitCode=0
Sep 30 06:35:28 crc kubenswrapper[4691]: I0930 06:35:28.894063 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" event={"ID":"5385150d-abdd-4b17-bbf4-fee7d4b5946e","Type":"ContainerDied","Data":"40e732a792146e552b05fa44be23dda22f4f40a9eff19885c012d4acec28c16a"}
Sep 30 06:35:28 crc kubenswrapper[4691]: I0930 06:35:28.894472 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" event={"ID":"5385150d-abdd-4b17-bbf4-fee7d4b5946e","Type":"ContainerStarted","Data":"a7a31101f62228e8fcb6f246abc71893edf4eb180fd0485bf9ed33a3cd12716d"}
Sep 30 06:35:29 crc kubenswrapper[4691]: I0930 06:35:29.911358 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" event={"ID":"5385150d-abdd-4b17-bbf4-fee7d4b5946e","Type":"ContainerStarted","Data":"6a0753e503f95d846c3c259455c74b6e174a2c797cab68037838cd926dabbc48"}
event={"ID":"5385150d-abdd-4b17-bbf4-fee7d4b5946e","Type":"ContainerStarted","Data":"6a0753e503f95d846c3c259455c74b6e174a2c797cab68037838cd926dabbc48"} Sep 30 06:35:29 crc kubenswrapper[4691]: I0930 06:35:29.911981 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:29 crc kubenswrapper[4691]: I0930 06:35:29.945463 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" podStartSLOduration=2.945406437 podStartE2EDuration="2.945406437s" podCreationTimestamp="2025-09-30 06:35:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:35:29.937196194 +0000 UTC m=+973.412217244" watchObservedRunningTime="2025-09-30 06:35:29.945406437 +0000 UTC m=+973.420427507" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.299498 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-gvbbh"] Sep 30 06:35:33 crc kubenswrapper[4691]: E0930 06:35:33.300393 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da5644f7-944d-4df5-ae95-535bbf9399a1" containerName="mariadb-account-create" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.300435 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="da5644f7-944d-4df5-ae95-535bbf9399a1" containerName="mariadb-account-create" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.300769 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="da5644f7-944d-4df5-ae95-535bbf9399a1" containerName="mariadb-account-create" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.301724 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.310391 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kzssm" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.310495 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.310986 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gvbbh"] Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.387295 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-combined-ca-bundle\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.387331 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-config-data\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.387388 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rllr\" (UniqueName: \"kubernetes.io/projected/30d87d59-039d-4ccf-a112-2beb7059e140-kube-api-access-4rllr\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.387406 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-db-sync-config-data\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.488981 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-combined-ca-bundle\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.489020 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-config-data\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.489061 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rllr\" (UniqueName: \"kubernetes.io/projected/30d87d59-039d-4ccf-a112-2beb7059e140-kube-api-access-4rllr\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.489079 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-db-sync-config-data\") pod 
\"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.494786 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-config-data\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.497533 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-db-sync-config-data\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.515450 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rllr\" (UniqueName: \"kubernetes.io/projected/30d87d59-039d-4ccf-a112-2beb7059e140-kube-api-access-4rllr\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.519715 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-combined-ca-bundle\") pod \"glance-db-sync-gvbbh\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.628491 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.667710 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="6b6ff7c5-6146-432e-a89c-fe95ac728e5c" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.105:5671: connect: connection refused" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.902266 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="fd5df9d9-7a0a-441c-b21d-92dff2af7376" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.106:5671: connect: connection refused" Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.949626 4691 generic.go:334] "Generic (PLEG): container finished" podID="e6d36519-9195-4e0b-9760-844d420e2661" containerID="e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6" exitCode=0 Sep 30 06:35:33 crc kubenswrapper[4691]: I0930 06:35:33.949676 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6d36519-9195-4e0b-9760-844d420e2661","Type":"ContainerDied","Data":"e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6"} Sep 30 06:35:34 crc kubenswrapper[4691]: I0930 06:35:34.205097 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-notifications-server-0" podUID="d454968e-74c7-45e3-9608-e915973c7f25" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.107:5671: connect: connection refused" Sep 30 06:35:34 crc kubenswrapper[4691]: I0930 06:35:34.222742 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-gvbbh"] Sep 30 06:35:34 crc kubenswrapper[4691]: W0930 06:35:34.239939 4691 manager.go:1169] Failed to process watch 
Sep 30 06:35:34 crc kubenswrapper[4691]: W0930 06:35:34.239939 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod30d87d59_039d_4ccf_a112_2beb7059e140.slice/crio-e269b54915772689413010625db1fae7b3f87e5f9c444d0233daecbf8662bbe0 WatchSource:0}: Error finding container e269b54915772689413010625db1fae7b3f87e5f9c444d0233daecbf8662bbe0: Status 404 returned error can't find the container with id e269b54915772689413010625db1fae7b3f87e5f9c444d0233daecbf8662bbe0 Sep 30 06:35:34 crc kubenswrapper[4691]: I0930 06:35:34.961120 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gvbbh" event={"ID":"30d87d59-039d-4ccf-a112-2beb7059e140","Type":"ContainerStarted","Data":"e269b54915772689413010625db1fae7b3f87e5f9c444d0233daecbf8662bbe0"} Sep 30 06:35:34 crc kubenswrapper[4691]: I0930 06:35:34.965522 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6d36519-9195-4e0b-9760-844d420e2661","Type":"ContainerStarted","Data":"f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b"} Sep 30 06:35:37 crc kubenswrapper[4691]: I0930 06:35:37.596453 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:35:37 crc kubenswrapper[4691]: I0930 06:35:37.646942 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-648b6fc9cc-b6vxg"] Sep 30 06:35:37 crc kubenswrapper[4691]: I0930 06:35:37.647159 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg" podUID="bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" containerName="dnsmasq-dns" containerID="cri-o://bf77a04625f42b543380f37ddd1340ba394ea4c321e945ee76a74c7774bc1181" gracePeriod=10 Sep 30 06:35:37 crc kubenswrapper[4691]: I0930 06:35:37.998955 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6d36519-9195-4e0b-9760-844d420e2661","Type":"ContainerStarted","Data":"ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204"} Sep 30 06:35:37 crc kubenswrapper[4691]: I0930 06:35:37.999230 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6d36519-9195-4e0b-9760-844d420e2661","Type":"ContainerStarted","Data":"8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5"} Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.008642 4691 generic.go:334] "Generic (PLEG): container finished" podID="bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" containerID="bf77a04625f42b543380f37ddd1340ba394ea4c321e945ee76a74c7774bc1181" exitCode=0 Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.008688 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg" event={"ID":"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c","Type":"ContainerDied","Data":"bf77a04625f42b543380f37ddd1340ba394ea4c321e945ee76a74c7774bc1181"} Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.032382 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.032367474 podStartE2EDuration="17.032367474s" podCreationTimestamp="2025-09-30 06:35:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:35:38.027796538 +0000 UTC m=+981.502817588" watchObservedRunningTime="2025-09-30 06:35:38.032367474 +0000 UTC m=+981.507388514" Sep 30 06:35:38 crc 
kubenswrapper[4691]: I0930 06:35:38.132480 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg" Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.172416 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-config\") pod \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.172563 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-sb\") pod \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.172605 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-dns-svc\") pod \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.172626 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-778pw\" (UniqueName: \"kubernetes.io/projected/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-kube-api-access-778pw\") pod \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.172669 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-nb\") pod \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\" (UID: \"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c\") " Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.179308 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-kube-api-access-778pw" (OuterVolumeSpecName: "kube-api-access-778pw") pod "bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" (UID: "bc9dc6ca-48fa-4947-a151-db88fc6bcd0c"). InnerVolumeSpecName "kube-api-access-778pw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.226529 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" (UID: "bc9dc6ca-48fa-4947-a151-db88fc6bcd0c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.229432 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" (UID: "bc9dc6ca-48fa-4947-a151-db88fc6bcd0c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.231653 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" (UID: "bc9dc6ca-48fa-4947-a151-db88fc6bcd0c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.243650 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-config" (OuterVolumeSpecName: "config") pod "bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" (UID: "bc9dc6ca-48fa-4947-a151-db88fc6bcd0c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.275188 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.275214 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.275226 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-778pw\" (UniqueName: \"kubernetes.io/projected/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-kube-api-access-778pw\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.275236 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:38 crc kubenswrapper[4691]: I0930 06:35:38.275244 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:39 crc kubenswrapper[4691]: I0930 06:35:39.019012 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg" event={"ID":"bc9dc6ca-48fa-4947-a151-db88fc6bcd0c","Type":"ContainerDied","Data":"534019919df35d562c6066de8287c10ab907f04ed20c23bae1c21bca0ae17561"} Sep 30 06:35:39 crc kubenswrapper[4691]: I0930 06:35:39.019057 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-648b6fc9cc-b6vxg" Sep 30 06:35:39 crc kubenswrapper[4691]: I0930 06:35:39.019349 4691 scope.go:117] "RemoveContainer" containerID="bf77a04625f42b543380f37ddd1340ba394ea4c321e945ee76a74c7774bc1181" Sep 30 06:35:39 crc kubenswrapper[4691]: I0930 06:35:39.093809 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-648b6fc9cc-b6vxg"] Sep 30 06:35:39 crc kubenswrapper[4691]: I0930 06:35:39.101312 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-648b6fc9cc-b6vxg"] Sep 30 06:35:39 crc kubenswrapper[4691]: I0930 06:35:39.237967 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" path="/var/lib/kubelet/pods/bc9dc6ca-48fa-4947-a151-db88fc6bcd0c/volumes" Sep 30 06:35:41 crc kubenswrapper[4691]: I0930 06:35:41.804001 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:43 crc kubenswrapper[4691]: I0930 06:35:43.667671 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 06:35:43 crc kubenswrapper[4691]: I0930 06:35:43.900259 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.094287 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-x5kh8"] Sep 30 06:35:44 crc kubenswrapper[4691]: E0930 06:35:44.094701 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" containerName="init" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.094723 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" containerName="init" Sep 30 06:35:44 crc kubenswrapper[4691]: E0930 06:35:44.094738 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" containerName="dnsmasq-dns" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.094745 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" containerName="dnsmasq-dns" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.094916 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9dc6ca-48fa-4947-a151-db88fc6bcd0c" containerName="dnsmasq-dns" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.095595 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-x5kh8" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.115092 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-x5kh8"] Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.193306 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-z5k76"] Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.194310 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z5k76" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.200831 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgrvl\" (UniqueName: \"kubernetes.io/projected/5c83300e-e91d-43bf-a9b7-ee763cea39b2-kube-api-access-tgrvl\") pod \"barbican-db-create-x5kh8\" (UID: \"5c83300e-e91d-43bf-a9b7-ee763cea39b2\") " pod="openstack/barbican-db-create-x5kh8" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.205048 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-notifications-server-0" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.270974 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-z5k76"] Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.302643 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xcw6\" (UniqueName: \"kubernetes.io/projected/16b4442d-198c-4824-a8f3-3fbfd345e87f-kube-api-access-9xcw6\") pod \"cinder-db-create-z5k76\" (UID: \"16b4442d-198c-4824-a8f3-3fbfd345e87f\") " pod="openstack/cinder-db-create-z5k76" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.302914 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgrvl\" (UniqueName: \"kubernetes.io/projected/5c83300e-e91d-43bf-a9b7-ee763cea39b2-kube-api-access-tgrvl\") pod \"barbican-db-create-x5kh8\" (UID: \"5c83300e-e91d-43bf-a9b7-ee763cea39b2\") " pod="openstack/barbican-db-create-x5kh8" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.350217 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgrvl\" (UniqueName: \"kubernetes.io/projected/5c83300e-e91d-43bf-a9b7-ee763cea39b2-kube-api-access-tgrvl\") pod \"barbican-db-create-x5kh8\" (UID: \"5c83300e-e91d-43bf-a9b7-ee763cea39b2\") " pod="openstack/barbican-db-create-x5kh8" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.387567 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-4dcl7"] Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.388528 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4dcl7" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.404169 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4dcl7"] Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.404495 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xcw6\" (UniqueName: \"kubernetes.io/projected/16b4442d-198c-4824-a8f3-3fbfd345e87f-kube-api-access-9xcw6\") pod \"cinder-db-create-z5k76\" (UID: \"16b4442d-198c-4824-a8f3-3fbfd345e87f\") " pod="openstack/cinder-db-create-z5k76" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.404625 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwvfr\" (UniqueName: \"kubernetes.io/projected/b5ed4c1d-acca-4979-876b-1b0fbb34443c-kube-api-access-pwvfr\") pod \"neutron-db-create-4dcl7\" (UID: \"b5ed4c1d-acca-4979-876b-1b0fbb34443c\") " pod="openstack/neutron-db-create-4dcl7" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.431646 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xcw6\" (UniqueName: \"kubernetes.io/projected/16b4442d-198c-4824-a8f3-3fbfd345e87f-kube-api-access-9xcw6\") pod \"cinder-db-create-z5k76\" (UID: \"16b4442d-198c-4824-a8f3-3fbfd345e87f\") " pod="openstack/cinder-db-create-z5k76" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.439227 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-x5kh8" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.483686 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-6grqn"] Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.484804 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.488639 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.488682 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fxfcb" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.491829 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.493571 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.505567 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwvfr\" (UniqueName: \"kubernetes.io/projected/b5ed4c1d-acca-4979-876b-1b0fbb34443c-kube-api-access-pwvfr\") pod \"neutron-db-create-4dcl7\" (UID: \"b5ed4c1d-acca-4979-876b-1b0fbb34443c\") " pod="openstack/neutron-db-create-4dcl7" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.505795 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6grqn"] Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.511745 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-z5k76" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.557358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwvfr\" (UniqueName: \"kubernetes.io/projected/b5ed4c1d-acca-4979-876b-1b0fbb34443c-kube-api-access-pwvfr\") pod \"neutron-db-create-4dcl7\" (UID: \"b5ed4c1d-acca-4979-876b-1b0fbb34443c\") " pod="openstack/neutron-db-create-4dcl7" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.608097 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlv4t\" (UniqueName: \"kubernetes.io/projected/29abdc62-6e41-49f9-8426-f8b4c1f25014-kube-api-access-tlv4t\") pod \"keystone-db-sync-6grqn\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.608157 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-combined-ca-bundle\") pod \"keystone-db-sync-6grqn\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.608200 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-config-data\") pod \"keystone-db-sync-6grqn\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.708068 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4dcl7" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.709277 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-config-data\") pod \"keystone-db-sync-6grqn\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.709841 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tlv4t\" (UniqueName: \"kubernetes.io/projected/29abdc62-6e41-49f9-8426-f8b4c1f25014-kube-api-access-tlv4t\") pod \"keystone-db-sync-6grqn\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.709878 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-combined-ca-bundle\") pod \"keystone-db-sync-6grqn\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.712518 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-combined-ca-bundle\") pod \"keystone-db-sync-6grqn\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.715962 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-config-data\") pod \"keystone-db-sync-6grqn\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.730518 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tlv4t\" (UniqueName: \"kubernetes.io/projected/29abdc62-6e41-49f9-8426-f8b4c1f25014-kube-api-access-tlv4t\") pod \"keystone-db-sync-6grqn\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:44 crc kubenswrapper[4691]: I0930 06:35:44.808956 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6grqn" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.625980 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-db-sync-k2cvv"] Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.627134 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.629079 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-config-data" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.629541 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-59ndn" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.638184 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-db-sync-config-data\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.638266 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-config-data\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.638363 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hds7w\" (UniqueName: \"kubernetes.io/projected/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-kube-api-access-hds7w\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.638396 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-combined-ca-bundle\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.639712 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-k2cvv"] Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.739552 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-config-data\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 
06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.739654 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hds7w\" (UniqueName: \"kubernetes.io/projected/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-kube-api-access-hds7w\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.739681 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-combined-ca-bundle\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.739705 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-db-sync-config-data\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.744816 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-db-sync-config-data\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.745447 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-combined-ca-bundle\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.747358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-config-data\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.754603 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hds7w\" (UniqueName: \"kubernetes.io/projected/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-kube-api-access-hds7w\") pod \"watcher-db-sync-k2cvv\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:46 crc kubenswrapper[4691]: I0930 06:35:46.946354 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:35:47 crc kubenswrapper[4691]: I0930 06:35:47.621923 4691 scope.go:117] "RemoveContainer" containerID="6cd42cfdf019af1631abfd5a42ec4c12c920decb6fcb33ef972285226821cd47" Sep 30 06:35:48 crc kubenswrapper[4691]: I0930 06:35:48.276395 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-db-sync-k2cvv"] Sep 30 06:35:48 crc kubenswrapper[4691]: I0930 06:35:48.289660 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-x5kh8"] Sep 30 06:35:48 crc kubenswrapper[4691]: I0930 06:35:48.421799 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-z5k76"] Sep 30 06:35:48 crc kubenswrapper[4691]: I0930 06:35:48.452532 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-6grqn"] Sep 30 06:35:48 crc kubenswrapper[4691]: I0930 06:35:48.553908 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-4dcl7"] Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.115334 4691 generic.go:334] "Generic (PLEG): container finished" podID="b5ed4c1d-acca-4979-876b-1b0fbb34443c" containerID="6922a5c6d13c5d78edb8eeb161243af2152c9433ba169d033271e10bf486a4ed" exitCode=0 Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.115434 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4dcl7" event={"ID":"b5ed4c1d-acca-4979-876b-1b0fbb34443c","Type":"ContainerDied","Data":"6922a5c6d13c5d78edb8eeb161243af2152c9433ba169d033271e10bf486a4ed"} Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.115594 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4dcl7" event={"ID":"b5ed4c1d-acca-4979-876b-1b0fbb34443c","Type":"ContainerStarted","Data":"38d30d2baa2f2c67f648c7dda689d1b71d32b688c55e123e79b97a7fc7ed58ee"} Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.123090 4691 generic.go:334] "Generic (PLEG): container finished" podID="16b4442d-198c-4824-a8f3-3fbfd345e87f" containerID="3a051da935daad5848c7eff04402e13b3aae4eb8b6ebcbb57bf8a6a9fddcf520" exitCode=0 Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.123128 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z5k76" event={"ID":"16b4442d-198c-4824-a8f3-3fbfd345e87f","Type":"ContainerDied","Data":"3a051da935daad5848c7eff04402e13b3aae4eb8b6ebcbb57bf8a6a9fddcf520"} Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.123185 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z5k76" event={"ID":"16b4442d-198c-4824-a8f3-3fbfd345e87f","Type":"ContainerStarted","Data":"df3e0ea4d7de9f49859a45584aeeabd37ebd195b537d56b59ce0c561e30b5001"} Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.125092 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-k2cvv" event={"ID":"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22","Type":"ContainerStarted","Data":"a6d4a5175e8c20122fb1136ddbceade8b9bb484a595f173ac3151b6b61285d6b"} Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.129589 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gvbbh" event={"ID":"30d87d59-039d-4ccf-a112-2beb7059e140","Type":"ContainerStarted","Data":"ea47924e4c369891476df9ecbb29c51739229ec3fd94fbc7ec8eb4455e40c7a7"} Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.130644 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6grqn" 
event={"ID":"29abdc62-6e41-49f9-8426-f8b4c1f25014","Type":"ContainerStarted","Data":"6746a57fd4cf253842594aa9d7ecdcd1f581317ca10ed6d03d4d06120f42c743"} Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.132219 4691 generic.go:334] "Generic (PLEG): container finished" podID="5c83300e-e91d-43bf-a9b7-ee763cea39b2" containerID="f98c62251adcc264b995ebac457d42302c3ef8121e140070275e014a1e9248bb" exitCode=0 Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.132259 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-x5kh8" event={"ID":"5c83300e-e91d-43bf-a9b7-ee763cea39b2","Type":"ContainerDied","Data":"f98c62251adcc264b995ebac457d42302c3ef8121e140070275e014a1e9248bb"} Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.132281 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-x5kh8" event={"ID":"5c83300e-e91d-43bf-a9b7-ee763cea39b2","Type":"ContainerStarted","Data":"273c60da6487294fd8a7bc0eb451b73b22934cd4ae05ca3f09c4fb73322b9426"} Sep 30 06:35:49 crc kubenswrapper[4691]: I0930 06:35:49.160158 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-gvbbh" podStartSLOduration=2.597803518 podStartE2EDuration="16.160138956s" podCreationTimestamp="2025-09-30 06:35:33 +0000 UTC" firstStartedPulling="2025-09-30 06:35:34.242726348 +0000 UTC m=+977.717747388" lastFinishedPulling="2025-09-30 06:35:47.805061786 +0000 UTC m=+991.280082826" observedRunningTime="2025-09-30 06:35:49.155679963 +0000 UTC m=+992.630701003" watchObservedRunningTime="2025-09-30 06:35:49.160138956 +0000 UTC m=+992.635159996" Sep 30 06:35:51 crc kubenswrapper[4691]: I0930 06:35:51.803792 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:51 crc kubenswrapper[4691]: I0930 06:35:51.813269 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.170045 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.776252 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-z5k76" Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.793843 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-4dcl7" Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.801740 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-x5kh8" Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.849606 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.849657 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.854478 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwvfr\" (UniqueName: \"kubernetes.io/projected/b5ed4c1d-acca-4979-876b-1b0fbb34443c-kube-api-access-pwvfr\") pod \"b5ed4c1d-acca-4979-876b-1b0fbb34443c\" (UID: \"b5ed4c1d-acca-4979-876b-1b0fbb34443c\") " Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.854626 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xcw6\" (UniqueName: \"kubernetes.io/projected/16b4442d-198c-4824-a8f3-3fbfd345e87f-kube-api-access-9xcw6\") pod \"16b4442d-198c-4824-a8f3-3fbfd345e87f\" (UID: \"16b4442d-198c-4824-a8f3-3fbfd345e87f\") " Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.860817 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5ed4c1d-acca-4979-876b-1b0fbb34443c-kube-api-access-pwvfr" (OuterVolumeSpecName: "kube-api-access-pwvfr") pod "b5ed4c1d-acca-4979-876b-1b0fbb34443c" (UID: "b5ed4c1d-acca-4979-876b-1b0fbb34443c"). InnerVolumeSpecName "kube-api-access-pwvfr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.881126 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b4442d-198c-4824-a8f3-3fbfd345e87f-kube-api-access-9xcw6" (OuterVolumeSpecName: "kube-api-access-9xcw6") pod "16b4442d-198c-4824-a8f3-3fbfd345e87f" (UID: "16b4442d-198c-4824-a8f3-3fbfd345e87f"). InnerVolumeSpecName "kube-api-access-9xcw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.956521 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgrvl\" (UniqueName: \"kubernetes.io/projected/5c83300e-e91d-43bf-a9b7-ee763cea39b2-kube-api-access-tgrvl\") pod \"5c83300e-e91d-43bf-a9b7-ee763cea39b2\" (UID: \"5c83300e-e91d-43bf-a9b7-ee763cea39b2\") " Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.957049 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwvfr\" (UniqueName: \"kubernetes.io/projected/b5ed4c1d-acca-4979-876b-1b0fbb34443c-kube-api-access-pwvfr\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.957072 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xcw6\" (UniqueName: \"kubernetes.io/projected/16b4442d-198c-4824-a8f3-3fbfd345e87f-kube-api-access-9xcw6\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:52 crc kubenswrapper[4691]: I0930 06:35:52.963042 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c83300e-e91d-43bf-a9b7-ee763cea39b2-kube-api-access-tgrvl" (OuterVolumeSpecName: "kube-api-access-tgrvl") pod "5c83300e-e91d-43bf-a9b7-ee763cea39b2" (UID: "5c83300e-e91d-43bf-a9b7-ee763cea39b2"). InnerVolumeSpecName "kube-api-access-tgrvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:53 crc kubenswrapper[4691]: I0930 06:35:53.060474 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgrvl\" (UniqueName: \"kubernetes.io/projected/5c83300e-e91d-43bf-a9b7-ee763cea39b2-kube-api-access-tgrvl\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:53 crc kubenswrapper[4691]: I0930 06:35:53.176597 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-z5k76" event={"ID":"16b4442d-198c-4824-a8f3-3fbfd345e87f","Type":"ContainerDied","Data":"df3e0ea4d7de9f49859a45584aeeabd37ebd195b537d56b59ce0c561e30b5001"} Sep 30 06:35:53 crc kubenswrapper[4691]: I0930 06:35:53.176615 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-z5k76" Sep 30 06:35:53 crc kubenswrapper[4691]: I0930 06:35:53.176644 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df3e0ea4d7de9f49859a45584aeeabd37ebd195b537d56b59ce0c561e30b5001" Sep 30 06:35:53 crc kubenswrapper[4691]: I0930 06:35:53.178986 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-x5kh8" event={"ID":"5c83300e-e91d-43bf-a9b7-ee763cea39b2","Type":"ContainerDied","Data":"273c60da6487294fd8a7bc0eb451b73b22934cd4ae05ca3f09c4fb73322b9426"} Sep 30 06:35:53 crc kubenswrapper[4691]: I0930 06:35:53.179032 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="273c60da6487294fd8a7bc0eb451b73b22934cd4ae05ca3f09c4fb73322b9426" Sep 30 06:35:53 crc kubenswrapper[4691]: I0930 06:35:53.178996 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-x5kh8" Sep 30 06:35:53 crc kubenswrapper[4691]: I0930 06:35:53.185162 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-4dcl7" Sep 30 06:35:53 crc kubenswrapper[4691]: I0930 06:35:53.185510 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-4dcl7" event={"ID":"b5ed4c1d-acca-4979-876b-1b0fbb34443c","Type":"ContainerDied","Data":"38d30d2baa2f2c67f648c7dda689d1b71d32b688c55e123e79b97a7fc7ed58ee"} Sep 30 06:35:53 crc kubenswrapper[4691]: I0930 06:35:53.185565 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38d30d2baa2f2c67f648c7dda689d1b71d32b688c55e123e79b97a7fc7ed58ee" Sep 30 06:35:57 crc kubenswrapper[4691]: I0930 06:35:57.227114 4691 generic.go:334] "Generic (PLEG): container finished" podID="30d87d59-039d-4ccf-a112-2beb7059e140" containerID="ea47924e4c369891476df9ecbb29c51739229ec3fd94fbc7ec8eb4455e40c7a7" exitCode=0 Sep 30 06:35:57 crc kubenswrapper[4691]: I0930 06:35:57.237476 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gvbbh" event={"ID":"30d87d59-039d-4ccf-a112-2beb7059e140","Type":"ContainerDied","Data":"ea47924e4c369891476df9ecbb29c51739229ec3fd94fbc7ec8eb4455e40c7a7"} Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.240397 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-k2cvv" event={"ID":"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22","Type":"ContainerStarted","Data":"0d7718ca1e11e88c44d8fa4dfb5710bb0f114057494cb8b506bac1c3b3c9bc3b"} Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.243672 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6grqn" event={"ID":"29abdc62-6e41-49f9-8426-f8b4c1f25014","Type":"ContainerStarted","Data":"6f136ee1f188d34813d25f3104b5f58f7826464fcf95988ad8bea55f1af5a2a7"} Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.270796 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-db-sync-k2cvv" podStartSLOduration=3.190811781 podStartE2EDuration="12.270775639s" podCreationTimestamp="2025-09-30 06:35:46 +0000 UTC" firstStartedPulling="2025-09-30 06:35:48.336784191 +0000 UTC m=+991.811805231" lastFinishedPulling="2025-09-30 06:35:57.416748049 +0000 UTC m=+1000.891769089" observedRunningTime="2025-09-30 06:35:58.266371788 +0000 UTC m=+1001.741392868" watchObservedRunningTime="2025-09-30 06:35:58.270775639 +0000 UTC m=+1001.745796689" Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.304117 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-6grqn" podStartSLOduration=5.546184866 podStartE2EDuration="14.304089898s" podCreationTimestamp="2025-09-30 06:35:44 +0000 UTC" firstStartedPulling="2025-09-30 06:35:48.623448601 +0000 UTC m=+992.098469641" lastFinishedPulling="2025-09-30 06:35:57.381353623 +0000 UTC m=+1000.856374673" observedRunningTime="2025-09-30 06:35:58.29138521 +0000 UTC m=+1001.766406330" watchObservedRunningTime="2025-09-30 06:35:58.304089898 +0000 UTC m=+1001.779110978" Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.741166 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.815617 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4rllr\" (UniqueName: \"kubernetes.io/projected/30d87d59-039d-4ccf-a112-2beb7059e140-kube-api-access-4rllr\") pod \"30d87d59-039d-4ccf-a112-2beb7059e140\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.815860 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-db-sync-config-data\") pod \"30d87d59-039d-4ccf-a112-2beb7059e140\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.815880 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-config-data\") pod \"30d87d59-039d-4ccf-a112-2beb7059e140\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.815917 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-combined-ca-bundle\") pod \"30d87d59-039d-4ccf-a112-2beb7059e140\" (UID: \"30d87d59-039d-4ccf-a112-2beb7059e140\") " Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.822986 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d87d59-039d-4ccf-a112-2beb7059e140-kube-api-access-4rllr" (OuterVolumeSpecName: "kube-api-access-4rllr") pod "30d87d59-039d-4ccf-a112-2beb7059e140" (UID: "30d87d59-039d-4ccf-a112-2beb7059e140"). InnerVolumeSpecName "kube-api-access-4rllr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.827968 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "30d87d59-039d-4ccf-a112-2beb7059e140" (UID: "30d87d59-039d-4ccf-a112-2beb7059e140"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.862944 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "30d87d59-039d-4ccf-a112-2beb7059e140" (UID: "30d87d59-039d-4ccf-a112-2beb7059e140"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.877824 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-config-data" (OuterVolumeSpecName: "config-data") pod "30d87d59-039d-4ccf-a112-2beb7059e140" (UID: "30d87d59-039d-4ccf-a112-2beb7059e140"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.917547 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4rllr\" (UniqueName: \"kubernetes.io/projected/30d87d59-039d-4ccf-a112-2beb7059e140-kube-api-access-4rllr\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.917575 4691 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.917584 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:58 crc kubenswrapper[4691]: I0930 06:35:58.917593 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/30d87d59-039d-4ccf-a112-2beb7059e140-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.254114 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-gvbbh" event={"ID":"30d87d59-039d-4ccf-a112-2beb7059e140","Type":"ContainerDied","Data":"e269b54915772689413010625db1fae7b3f87e5f9c444d0233daecbf8662bbe0"} Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.254144 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-gvbbh" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.254156 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e269b54915772689413010625db1fae7b3f87e5f9c444d0233daecbf8662bbe0" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.763937 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-64474f58f5-r2hzx"] Sep 30 06:35:59 crc kubenswrapper[4691]: E0930 06:35:59.764539 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5ed4c1d-acca-4979-876b-1b0fbb34443c" containerName="mariadb-database-create" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.764550 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5ed4c1d-acca-4979-876b-1b0fbb34443c" containerName="mariadb-database-create" Sep 30 06:35:59 crc kubenswrapper[4691]: E0930 06:35:59.764564 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b4442d-198c-4824-a8f3-3fbfd345e87f" containerName="mariadb-database-create" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.764570 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b4442d-198c-4824-a8f3-3fbfd345e87f" containerName="mariadb-database-create" Sep 30 06:35:59 crc kubenswrapper[4691]: E0930 06:35:59.764604 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30d87d59-039d-4ccf-a112-2beb7059e140" containerName="glance-db-sync" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.764610 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d87d59-039d-4ccf-a112-2beb7059e140" containerName="glance-db-sync" Sep 30 06:35:59 crc kubenswrapper[4691]: E0930 06:35:59.764618 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c83300e-e91d-43bf-a9b7-ee763cea39b2" containerName="mariadb-database-create" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.764624 4691 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="5c83300e-e91d-43bf-a9b7-ee763cea39b2" containerName="mariadb-database-create" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.764769 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b4442d-198c-4824-a8f3-3fbfd345e87f" containerName="mariadb-database-create" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.764782 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5ed4c1d-acca-4979-876b-1b0fbb34443c" containerName="mariadb-database-create" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.764800 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="30d87d59-039d-4ccf-a112-2beb7059e140" containerName="glance-db-sync" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.764809 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c83300e-e91d-43bf-a9b7-ee763cea39b2" containerName="mariadb-database-create" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.765736 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.783747 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64474f58f5-r2hzx"] Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.852232 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-nb\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.852321 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-sb\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.852360 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-config\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.852388 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-swift-storage-0\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.852410 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-svc\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.852456 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7npvs\" (UniqueName: 
\"kubernetes.io/projected/9d0defd4-49be-4dd9-a218-dd734aa84089-kube-api-access-7npvs\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.953394 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-nb\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.953491 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-sb\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.953535 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-config\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.953563 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-swift-storage-0\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.953592 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-svc\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.953636 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7npvs\" (UniqueName: \"kubernetes.io/projected/9d0defd4-49be-4dd9-a218-dd734aa84089-kube-api-access-7npvs\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.954314 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-nb\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.954385 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-sb\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.954532 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-swift-storage-0\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.954581 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-config\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.955412 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-svc\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:35:59 crc kubenswrapper[4691]: I0930 06:35:59.979506 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7npvs\" (UniqueName: \"kubernetes.io/projected/9d0defd4-49be-4dd9-a218-dd734aa84089-kube-api-access-7npvs\") pod \"dnsmasq-dns-64474f58f5-r2hzx\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:36:00 crc kubenswrapper[4691]: I0930 06:36:00.113914 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:36:00 crc kubenswrapper[4691]: I0930 06:36:00.638961 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-64474f58f5-r2hzx"] Sep 30 06:36:00 crc kubenswrapper[4691]: W0930 06:36:00.645285 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d0defd4_49be_4dd9_a218_dd734aa84089.slice/crio-589915edb111ce3e13ad5233c70d2474ff7c7677e8591734f082521eaabd4b51 WatchSource:0}: Error finding container 589915edb111ce3e13ad5233c70d2474ff7c7677e8591734f082521eaabd4b51: Status 404 returned error can't find the container with id 589915edb111ce3e13ad5233c70d2474ff7c7677e8591734f082521eaabd4b51 Sep 30 06:36:01 crc kubenswrapper[4691]: I0930 06:36:01.316667 4691 generic.go:334] "Generic (PLEG): container finished" podID="097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22" containerID="0d7718ca1e11e88c44d8fa4dfb5710bb0f114057494cb8b506bac1c3b3c9bc3b" exitCode=0 Sep 30 06:36:01 crc kubenswrapper[4691]: I0930 06:36:01.316744 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-k2cvv" event={"ID":"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22","Type":"ContainerDied","Data":"0d7718ca1e11e88c44d8fa4dfb5710bb0f114057494cb8b506bac1c3b3c9bc3b"} Sep 30 06:36:01 crc kubenswrapper[4691]: I0930 06:36:01.318865 4691 generic.go:334] "Generic (PLEG): container finished" podID="9d0defd4-49be-4dd9-a218-dd734aa84089" containerID="78f1633b7b88765e2b27b86c74664b94401c1e2a15947993c4aadcc14476d046" exitCode=0 Sep 30 06:36:01 crc kubenswrapper[4691]: I0930 06:36:01.318910 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" event={"ID":"9d0defd4-49be-4dd9-a218-dd734aa84089","Type":"ContainerDied","Data":"78f1633b7b88765e2b27b86c74664b94401c1e2a15947993c4aadcc14476d046"} Sep 30 06:36:01 crc kubenswrapper[4691]: I0930 06:36:01.318930 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" 
event={"ID":"9d0defd4-49be-4dd9-a218-dd734aa84089","Type":"ContainerStarted","Data":"589915edb111ce3e13ad5233c70d2474ff7c7677e8591734f082521eaabd4b51"} Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.331520 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" event={"ID":"9d0defd4-49be-4dd9-a218-dd734aa84089","Type":"ContainerStarted","Data":"5b8287334e2647cb03e73982442293c0a26f9013e07695a797d373fbf4f0a086"} Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.331971 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.334604 4691 generic.go:334] "Generic (PLEG): container finished" podID="29abdc62-6e41-49f9-8426-f8b4c1f25014" containerID="6f136ee1f188d34813d25f3104b5f58f7826464fcf95988ad8bea55f1af5a2a7" exitCode=0 Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.334951 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6grqn" event={"ID":"29abdc62-6e41-49f9-8426-f8b4c1f25014","Type":"ContainerDied","Data":"6f136ee1f188d34813d25f3104b5f58f7826464fcf95988ad8bea55f1af5a2a7"} Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.366552 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" podStartSLOduration=3.366537131 podStartE2EDuration="3.366537131s" podCreationTimestamp="2025-09-30 06:35:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:02.35869365 +0000 UTC m=+1005.833714690" watchObservedRunningTime="2025-09-30 06:36:02.366537131 +0000 UTC m=+1005.841558161" Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.758811 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.815256 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hds7w\" (UniqueName: \"kubernetes.io/projected/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-kube-api-access-hds7w\") pod \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.815424 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-config-data\") pod \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.815516 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-combined-ca-bundle\") pod \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.815625 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-db-sync-config-data\") pod \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\" (UID: \"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22\") " Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.824646 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22" (UID: "097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.826003 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-kube-api-access-hds7w" (OuterVolumeSpecName: "kube-api-access-hds7w") pod "097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22" (UID: "097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22"). InnerVolumeSpecName "kube-api-access-hds7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.847641 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22" (UID: "097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.898228 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-config-data" (OuterVolumeSpecName: "config-data") pod "097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22" (UID: "097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.918541 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hds7w\" (UniqueName: \"kubernetes.io/projected/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-kube-api-access-hds7w\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.918628 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.918658 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:02 crc kubenswrapper[4691]: I0930 06:36:02.918685 4691 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.361937 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-db-sync-k2cvv" event={"ID":"097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22","Type":"ContainerDied","Data":"a6d4a5175e8c20122fb1136ddbceade8b9bb484a595f173ac3151b6b61285d6b"} Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.362311 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a6d4a5175e8c20122fb1136ddbceade8b9bb484a595f173ac3151b6b61285d6b" Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.362387 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-db-sync-k2cvv" Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.793870 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6grqn" Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.837654 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-combined-ca-bundle\") pod \"29abdc62-6e41-49f9-8426-f8b4c1f25014\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.837796 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tlv4t\" (UniqueName: \"kubernetes.io/projected/29abdc62-6e41-49f9-8426-f8b4c1f25014-kube-api-access-tlv4t\") pod \"29abdc62-6e41-49f9-8426-f8b4c1f25014\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.837880 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-config-data\") pod \"29abdc62-6e41-49f9-8426-f8b4c1f25014\" (UID: \"29abdc62-6e41-49f9-8426-f8b4c1f25014\") " Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.843226 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29abdc62-6e41-49f9-8426-f8b4c1f25014-kube-api-access-tlv4t" (OuterVolumeSpecName: "kube-api-access-tlv4t") pod "29abdc62-6e41-49f9-8426-f8b4c1f25014" (UID: "29abdc62-6e41-49f9-8426-f8b4c1f25014"). InnerVolumeSpecName "kube-api-access-tlv4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.866197 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "29abdc62-6e41-49f9-8426-f8b4c1f25014" (UID: "29abdc62-6e41-49f9-8426-f8b4c1f25014"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.910236 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-config-data" (OuterVolumeSpecName: "config-data") pod "29abdc62-6e41-49f9-8426-f8b4c1f25014" (UID: "29abdc62-6e41-49f9-8426-f8b4c1f25014"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.940141 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.940179 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tlv4t\" (UniqueName: \"kubernetes.io/projected/29abdc62-6e41-49f9-8426-f8b4c1f25014-kube-api-access-tlv4t\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:03 crc kubenswrapper[4691]: I0930 06:36:03.940195 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29abdc62-6e41-49f9-8426-f8b4c1f25014-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.208542 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-7341-account-create-ln8wm"] Sep 30 06:36:04 crc kubenswrapper[4691]: E0930 06:36:04.209260 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22" containerName="watcher-db-sync" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.209280 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22" containerName="watcher-db-sync" Sep 30 06:36:04 crc kubenswrapper[4691]: E0930 06:36:04.209302 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29abdc62-6e41-49f9-8426-f8b4c1f25014" containerName="keystone-db-sync" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.209309 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="29abdc62-6e41-49f9-8426-f8b4c1f25014" containerName="keystone-db-sync" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.209518 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22" containerName="watcher-db-sync" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.209541 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="29abdc62-6e41-49f9-8426-f8b4c1f25014" containerName="keystone-db-sync" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.210175 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-7341-account-create-ln8wm" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.215213 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.309038 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7341-account-create-ln8wm"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.310557 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prd5c\" (UniqueName: \"kubernetes.io/projected/0ce9ee68-c2b5-456f-9a12-5493b94729ea-kube-api-access-prd5c\") pod \"cinder-7341-account-create-ln8wm\" (UID: \"0ce9ee68-c2b5-456f-9a12-5493b94729ea\") " pod="openstack/cinder-7341-account-create-ln8wm" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.378269 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-6grqn" event={"ID":"29abdc62-6e41-49f9-8426-f8b4c1f25014","Type":"ContainerDied","Data":"6746a57fd4cf253842594aa9d7ecdcd1f581317ca10ed6d03d4d06120f42c743"} Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.378947 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6746a57fd4cf253842594aa9d7ecdcd1f581317ca10ed6d03d4d06120f42c743" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.378342 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-6grqn" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.414064 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prd5c\" (UniqueName: \"kubernetes.io/projected/0ce9ee68-c2b5-456f-9a12-5493b94729ea-kube-api-access-prd5c\") pod \"cinder-7341-account-create-ln8wm\" (UID: \"0ce9ee68-c2b5-456f-9a12-5493b94729ea\") " pod="openstack/cinder-7341-account-create-ln8wm" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.439231 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prd5c\" (UniqueName: \"kubernetes.io/projected/0ce9ee68-c2b5-456f-9a12-5493b94729ea-kube-api-access-prd5c\") pod \"cinder-7341-account-create-ln8wm\" (UID: \"0ce9ee68-c2b5-456f-9a12-5493b94729ea\") " pod="openstack/cinder-7341-account-create-ln8wm" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.452815 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c5ed-account-create-d7rxh"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.454212 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c5ed-account-create-d7rxh" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.457268 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.474338 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c5ed-account-create-d7rxh"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.494579 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-d99e-account-create-zrrb2"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.495625 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d99e-account-create-zrrb2" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.498760 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.515533 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d99e-account-create-zrrb2"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.585294 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64474f58f5-r2hzx"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.585771 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" podUID="9d0defd4-49be-4dd9-a218-dd734aa84089" containerName="dnsmasq-dns" containerID="cri-o://5b8287334e2647cb03e73982442293c0a26f9013e07695a797d373fbf4f0a086" gracePeriod=10 Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.598418 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-wmzh4"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.599895 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.604516 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fxfcb" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.604900 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.605133 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.605309 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.618964 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r65zg\" (UniqueName: \"kubernetes.io/projected/cae1274f-384f-452d-b80f-1c2a3712bb49-kube-api-access-r65zg\") pod \"barbican-c5ed-account-create-d7rxh\" (UID: \"cae1274f-384f-452d-b80f-1c2a3712bb49\") " pod="openstack/barbican-c5ed-account-create-d7rxh" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.619054 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d84l5\" (UniqueName: \"kubernetes.io/projected/4f76bf7e-6020-4d8a-a15c-f2d497629fd9-kube-api-access-d84l5\") pod \"neutron-d99e-account-create-zrrb2\" (UID: \"4f76bf7e-6020-4d8a-a15c-f2d497629fd9\") " pod="openstack/neutron-d99e-account-create-zrrb2" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.620509 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7341-account-create-ln8wm" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.626442 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wmzh4"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.643943 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56884c66c5-5zr58"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.645391 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56884c66c5-5zr58" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.656263 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56884c66c5-5zr58"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.720480 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.721586 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.722154 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-credential-keys\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.722261 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r65zg\" (UniqueName: \"kubernetes.io/projected/cae1274f-384f-452d-b80f-1c2a3712bb49-kube-api-access-r65zg\") pod \"barbican-c5ed-account-create-d7rxh\" (UID: \"cae1274f-384f-452d-b80f-1c2a3712bb49\") " pod="openstack/barbican-c5ed-account-create-d7rxh" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.722368 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-fernet-keys\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.722439 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d84l5\" (UniqueName: \"kubernetes.io/projected/4f76bf7e-6020-4d8a-a15c-f2d497629fd9-kube-api-access-d84l5\") pod \"neutron-d99e-account-create-zrrb2\" (UID: \"4f76bf7e-6020-4d8a-a15c-f2d497629fd9\") " pod="openstack/neutron-d99e-account-create-zrrb2" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.722553 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7t6h\" (UniqueName: \"kubernetes.io/projected/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-kube-api-access-w7t6h\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.722671 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-config-data\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.722755 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-combined-ca-bundle\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.722842 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-scripts\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.734122 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.734533 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-watcher-dockercfg-59ndn" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.753606 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.755140 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.756663 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d84l5\" (UniqueName: \"kubernetes.io/projected/4f76bf7e-6020-4d8a-a15c-f2d497629fd9-kube-api-access-d84l5\") pod \"neutron-d99e-account-create-zrrb2\" (UID: \"4f76bf7e-6020-4d8a-a15c-f2d497629fd9\") " pod="openstack/neutron-d99e-account-create-zrrb2" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.762154 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.762395 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r65zg\" (UniqueName: \"kubernetes.io/projected/cae1274f-384f-452d-b80f-1c2a3712bb49-kube-api-access-r65zg\") pod \"barbican-c5ed-account-create-d7rxh\" (UID: \"cae1274f-384f-452d-b80f-1c2a3712bb49\") " pod="openstack/barbican-c5ed-account-create-d7rxh" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.767425 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c5ed-account-create-d7rxh" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.771963 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.800228 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"] Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.801321 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.809284 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.813823 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d99e-account-create-zrrb2" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.825777 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-fernet-keys\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.825824 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.825861 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.825899 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7t6h\" (UniqueName: \"kubernetes.io/projected/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-kube-api-access-w7t6h\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.825924 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-config\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.825966 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-config-data\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826070 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-combined-ca-bundle\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826121 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-scripts\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826155 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-svc\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58" 
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826193 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-sb\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826273 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-swift-storage-0\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826308 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-credential-keys\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826340 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826388 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-nb\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826418 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6ff9\" (UniqueName: \"kubernetes.io/projected/0420af49-e022-4b04-8cbf-9ba0139fbcb1-kube-api-access-z6ff9\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826457 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6zpp\" (UniqueName: \"kubernetes.io/projected/b9e263ab-2384-42b0-8c7b-a787bcf361a9-kube-api-access-k6zpp\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.826526 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e263ab-2384-42b0-8c7b-a787bcf361a9-logs\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.841450 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-config-data\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.844818 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-scripts\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.845268 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-credential-keys\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.863384 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-fernet-keys\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.863725 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-combined-ca-bundle\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.867958 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.877009 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7t6h\" (UniqueName: \"kubernetes.io/projected/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-kube-api-access-w7t6h\") pod \"keystone-bootstrap-wmzh4\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " pod="openstack/keystone-bootstrap-wmzh4"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.898942 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.928776 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-config-data\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.928826 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e263ab-2384-42b0-8c7b-a787bcf361a9-logs\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.928848 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.928878 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spwrv\" (UniqueName: \"kubernetes.io/projected/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-kube-api-access-spwrv\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.928909 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9dpg\" (UniqueName: \"kubernetes.io/projected/a1cbb8f9-118e-48b5-ae92-067ece5295a2-kube-api-access-s9dpg\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.928926 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.928946 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.928970 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-config\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929022 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-svc\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929039 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-sb\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929055 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-logs\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929080 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929101 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-swift-storage-0\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929120 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929141 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-config-data\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929238 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-nb\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929253 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cbb8f9-118e-48b5-ae92-067ece5295a2-logs\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929273 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z6ff9\" (UniqueName: \"kubernetes.io/projected/0420af49-e022-4b04-8cbf-9ba0139fbcb1-kube-api-access-z6ff9\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929289 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.929310 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6zpp\" (UniqueName: \"kubernetes.io/projected/b9e263ab-2384-42b0-8c7b-a787bcf361a9-kube-api-access-k6zpp\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.930012 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e263ab-2384-42b0-8c7b-a787bcf361a9-logs\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.937988 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-swift-storage-0\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.938485 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-config\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.938996 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-svc\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.939552 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-sb\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.940111 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-nb\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.940567 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-config-data\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.941880 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-wmzh4"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.942344 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.945853 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-67dcbcd77c-9lrb5"]
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.947238 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67dcbcd77c-9lrb5"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.953618 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.964025 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.964369 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67dcbcd77c-9lrb5"]
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.964442 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-7gpfr"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.964551 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon"
Sep 30 06:36:04 crc kubenswrapper[4691]: I0930 06:36:04.964656 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:04.976625 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6zpp\" (UniqueName: \"kubernetes.io/projected/b9e263ab-2384-42b0-8c7b-a787bcf361a9-kube-api-access-k6zpp\") pod \"watcher-decision-engine-0\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:04.984218 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6ff9\" (UniqueName: \"kubernetes.io/projected/0420af49-e022-4b04-8cbf-9ba0139fbcb1-kube-api-access-z6ff9\") pod \"dnsmasq-dns-56884c66c5-5zr58\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:04.986423 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:04.988443 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:04.996182 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56884c66c5-5zr58"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:04.996029 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:04.997329 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:04.999722 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.002032 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.032744 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cbb8f9-118e-48b5-ae92-067ece5295a2-logs\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.032789 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.032828 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-config-data\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.032867 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spwrv\" (UniqueName: \"kubernetes.io/projected/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-kube-api-access-spwrv\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.032897 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9dpg\" (UniqueName: \"kubernetes.io/projected/a1cbb8f9-118e-48b5-ae92-067ece5295a2-kube-api-access-s9dpg\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.032917 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.032973 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-logs\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.032997 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.033031 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-config-data\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.040048 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-config-data\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.043643 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cbb8f9-118e-48b5-ae92-067ece5295a2-logs\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.044150 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-logs\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.055375 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.057756 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-config-data\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.066491 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.067325 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.072977 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.074508 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9dpg\" (UniqueName: \"kubernetes.io/projected/a1cbb8f9-118e-48b5-ae92-067ece5295a2-kube-api-access-s9dpg\") pod \"watcher-applier-0\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.075019 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.085849 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-f69875457-kvdnf"]
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.087330 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f69875457-kvdnf"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.090259 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.090486 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.090674 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-kzssm"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.090762 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.098545 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spwrv\" (UniqueName: \"kubernetes.io/projected/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-kube-api-access-spwrv\") pod \"watcher-api-0\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") " pod="openstack/watcher-api-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.098600 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-jfg2g"]
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.125236 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jfg2g"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.129770 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fc6j9"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.130481 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.133068 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139420 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-scripts\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139464 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139517 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prxkn\" (UniqueName: \"kubernetes.io/projected/ca89855f-21bb-4d05-93aa-5705f6d93548-kube-api-access-prxkn\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139552 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmmmv\" (UniqueName: \"kubernetes.io/projected/6781ac88-7516-4101-8abd-9cacfbb930b7-kube-api-access-vmmmv\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139595 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca89855f-21bb-4d05-93aa-5705f6d93548-horizon-secret-key\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139621 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139696 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca89855f-21bb-4d05-93aa-5705f6d93548-logs\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139725 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-log-httpd\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139791 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-scripts\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139866 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-run-httpd\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139904 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-config-data\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.139940 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-config-data\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5"
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.180730 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.242087 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-config-data\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5"
Sep 30
crc kubenswrapper[4691]: I0930 06:36:05.242392 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxzz\" (UniqueName: \"kubernetes.io/projected/730e43c6-3b1f-4a8c-9540-4ff131592381-kube-api-access-tpxzz\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.242431 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.242450 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-config-data\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.242470 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp995\" (UniqueName: \"kubernetes.io/projected/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-kube-api-access-pp995\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.242509 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-scripts\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.242528 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.242688 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.242708 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-logs\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.242947 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prxkn\" (UniqueName: \"kubernetes.io/projected/ca89855f-21bb-4d05-93aa-5705f6d93548-kube-api-access-prxkn\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 
06:36:05.242981 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.243200 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmmmv\" (UniqueName: \"kubernetes.io/projected/6781ac88-7516-4101-8abd-9cacfbb930b7-kube-api-access-vmmmv\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.243217 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49bf9\" (UniqueName: \"kubernetes.io/projected/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-kube-api-access-49bf9\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.243239 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-logs\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.243273 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.243290 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca89855f-21bb-4d05-93aa-5705f6d93548-horizon-secret-key\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.244691 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-scripts\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.245130 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-config-data\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.245690 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-scripts\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.246233 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.246815 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-scripts\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.246842 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/730e43c6-3b1f-4a8c-9540-4ff131592381-horizon-secret-key\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.246997 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca89855f-21bb-4d05-93aa-5705f6d93548-logs\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.247048 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-log-httpd\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.247081 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-scripts\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.247163 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-scripts\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.247180 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-combined-ca-bundle\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.247222 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/730e43c6-3b1f-4a8c-9540-4ff131592381-logs\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.247267 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-config-data\") pod 
\"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.247301 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-config-data\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.247342 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-run-httpd\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.247360 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-config-data\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.248345 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-log-httpd\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.248755 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-run-httpd\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.249818 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca89855f-21bb-4d05-93aa-5705f6d93548-logs\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.259936 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca89855f-21bb-4d05-93aa-5705f6d93548-horizon-secret-key\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.265487 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.265860 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.267756 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-scripts\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.268830 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-config-data\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.280043 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prxkn\" (UniqueName: \"kubernetes.io/projected/ca89855f-21bb-4d05-93aa-5705f6d93548-kube-api-access-prxkn\") pod \"horizon-67dcbcd77c-9lrb5\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.281853 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmmmv\" (UniqueName: \"kubernetes.io/projected/6781ac88-7516-4101-8abd-9cacfbb930b7-kube-api-access-vmmmv\") pod \"ceilometer-0\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.309782 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jfg2g"] Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.309813 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f69875457-kvdnf"] Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.309824 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56884c66c5-5zr58"] Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.309842 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7597958cd9-k94q9"] Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.311471 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.323625 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7597958cd9-k94q9"] Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.326738 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.349591 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-logs\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.349639 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-logs\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.349672 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.349696 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49bf9\" (UniqueName: \"kubernetes.io/projected/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-kube-api-access-49bf9\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.349732 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-logs\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.349747 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.349766 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-scripts\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.349795 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-scripts\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.349816 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/730e43c6-3b1f-4a8c-9540-4ff131592381-horizon-secret-key\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.349873 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-scripts\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350057 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-combined-ca-bundle\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350087 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/730e43c6-3b1f-4a8c-9540-4ff131592381-logs\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350106 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-config-data\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350125 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-config-data\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350161 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxzz\" (UniqueName: \"kubernetes.io/projected/730e43c6-3b1f-4a8c-9540-4ff131592381-kube-api-access-tpxzz\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350179 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350196 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-config-data\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350211 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp995\" (UniqueName: \"kubernetes.io/projected/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-kube-api-access-pp995\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350252 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod 
\"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350559 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.352156 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-scripts\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.350842 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.352706 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.353464 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-logs\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.354102 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.354397 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/730e43c6-3b1f-4a8c-9540-4ff131592381-logs\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.356848 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-config-data\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.359119 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-combined-ca-bundle\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.361159 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-scripts\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.361742 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.362246 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.365564 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.365588 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.369304 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.374677 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.375628 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-config-data\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.379615 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/730e43c6-3b1f-4a8c-9540-4ff131592381-horizon-secret-key\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.381476 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-config-data\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.381780 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-scripts\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.387291 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxzz\" (UniqueName: \"kubernetes.io/projected/730e43c6-3b1f-4a8c-9540-4ff131592381-kube-api-access-tpxzz\") pod \"horizon-f69875457-kvdnf\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.394674 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp995\" (UniqueName: \"kubernetes.io/projected/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-kube-api-access-pp995\") pod \"placement-db-sync-jfg2g\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.399129 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-49bf9\" (UniqueName: \"kubernetes.io/projected/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-kube-api-access-49bf9\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.418613 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.442358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.453158 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-logs\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.453208 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-svc\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.453226 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.453288 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.453305 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.453323 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-swift-storage-0\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.453344 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.453360 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.453389 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9xfp\" (UniqueName: \"kubernetes.io/projected/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-kube-api-access-n9xfp\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.453434 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.454146 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-config\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.454195 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.454224 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.454290 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kt7t\" (UniqueName: \"kubernetes.io/projected/4a08181f-97e1-4058-b391-f380edf04dc4-kube-api-access-6kt7t\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.456553 4691 generic.go:334] "Generic (PLEG): container finished" podID="9d0defd4-49be-4dd9-a218-dd734aa84089" containerID="5b8287334e2647cb03e73982442293c0a26f9013e07695a797d373fbf4f0a086" exitCode=0 Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.456591 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" event={"ID":"9d0defd4-49be-4dd9-a218-dd734aa84089","Type":"ContainerDied","Data":"5b8287334e2647cb03e73982442293c0a26f9013e07695a797d373fbf4f0a086"} Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.460298 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.493176 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.555945 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556030 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-config\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556051 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556069 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556094 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kt7t\" (UniqueName: \"kubernetes.io/projected/4a08181f-97e1-4058-b391-f380edf04dc4-kube-api-access-6kt7t\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556121 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-logs\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556144 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-svc\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556159 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556200 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556219 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556240 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-swift-storage-0\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556262 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556278 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556303 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9xfp\" (UniqueName: \"kubernetes.io/projected/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-kube-api-access-n9xfp\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.556967 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.557653 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-config\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.558183 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-sb\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 
06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.558714 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.559483 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-swift-storage-0\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.559726 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-logs\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.562390 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.562583 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-nb\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.562760 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-svc\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.563177 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.564358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.564698 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.569012 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.572534 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9xfp\" (UniqueName: \"kubernetes.io/projected/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-kube-api-access-n9xfp\") pod \"dnsmasq-dns-7597958cd9-k94q9\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.580149 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kt7t\" (UniqueName: \"kubernetes.io/projected/4a08181f-97e1-4058-b391-f380edf04dc4-kube-api-access-6kt7t\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.612052 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.643583 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.667528 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:05 crc kubenswrapper[4691]: I0930 06:36:05.707062 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.226714 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.268677 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-nb\") pod \"9d0defd4-49be-4dd9-a218-dd734aa84089\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.268812 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-svc\") pod \"9d0defd4-49be-4dd9-a218-dd734aa84089\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.268841 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-config\") pod \"9d0defd4-49be-4dd9-a218-dd734aa84089\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.268873 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7npvs\" (UniqueName: \"kubernetes.io/projected/9d0defd4-49be-4dd9-a218-dd734aa84089-kube-api-access-7npvs\") pod \"9d0defd4-49be-4dd9-a218-dd734aa84089\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.268929 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-sb\") pod \"9d0defd4-49be-4dd9-a218-dd734aa84089\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.268980 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-swift-storage-0\") pod \"9d0defd4-49be-4dd9-a218-dd734aa84089\" (UID: \"9d0defd4-49be-4dd9-a218-dd734aa84089\") " Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.283049 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d0defd4-49be-4dd9-a218-dd734aa84089-kube-api-access-7npvs" (OuterVolumeSpecName: "kube-api-access-7npvs") pod "9d0defd4-49be-4dd9-a218-dd734aa84089" (UID: "9d0defd4-49be-4dd9-a218-dd734aa84089"). InnerVolumeSpecName "kube-api-access-7npvs". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.308906 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d0defd4-49be-4dd9-a218-dd734aa84089" (UID: "9d0defd4-49be-4dd9-a218-dd734aa84089"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.316278 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d0defd4-49be-4dd9-a218-dd734aa84089" (UID: "9d0defd4-49be-4dd9-a218-dd734aa84089"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.328353 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-config" (OuterVolumeSpecName: "config") pod "9d0defd4-49be-4dd9-a218-dd734aa84089" (UID: "9d0defd4-49be-4dd9-a218-dd734aa84089"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.334294 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d0defd4-49be-4dd9-a218-dd734aa84089" (UID: "9d0defd4-49be-4dd9-a218-dd734aa84089"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.352757 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d0defd4-49be-4dd9-a218-dd734aa84089" (UID: "9d0defd4-49be-4dd9-a218-dd734aa84089"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.371275 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.371306 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.371314 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.371323 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7npvs\" (UniqueName: \"kubernetes.io/projected/9d0defd4-49be-4dd9-a218-dd734aa84089-kube-api-access-7npvs\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.371333 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.371343 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d0defd4-49be-4dd9-a218-dd734aa84089-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.468539 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" event={"ID":"9d0defd4-49be-4dd9-a218-dd734aa84089","Type":"ContainerDied","Data":"589915edb111ce3e13ad5233c70d2474ff7c7677e8591734f082521eaabd4b51"} Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.468604 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-64474f58f5-r2hzx" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.468642 4691 scope.go:117] "RemoveContainer" containerID="5b8287334e2647cb03e73982442293c0a26f9013e07695a797d373fbf4f0a086" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.487274 4691 scope.go:117] "RemoveContainer" containerID="78f1633b7b88765e2b27b86c74664b94401c1e2a15947993c4aadcc14476d046" Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.509969 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-64474f58f5-r2hzx"] Sep 30 06:36:06 crc kubenswrapper[4691]: I0930 06:36:06.521461 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-64474f58f5-r2hzx"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.170291 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-wmzh4"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.353536 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d0defd4-49be-4dd9-a218-dd734aa84089" path="/var/lib/kubelet/pods/9d0defd4-49be-4dd9-a218-dd734aa84089/volumes" Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.354392 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-jfg2g"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.354425 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-67dcbcd77c-9lrb5"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.354438 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.354447 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c5ed-account-create-d7rxh"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.354456 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-7341-account-create-ln8wm"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.354465 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56884c66c5-5zr58"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.354476 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7597958cd9-k94q9"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.354485 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.354495 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-f69875457-kvdnf"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.354505 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.357419 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-d99e-account-create-zrrb2"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.363012 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.374139 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:36:07 crc kubenswrapper[4691]: W0930 06:36:07.376176 4691 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd9d464ee_9ac9_4aed_ba1c_5c8fbc0d9fa4.slice/crio-d7d5dbafc155878b8eed18e2b35f3613b5d776cb483ae10923a6957d8f04e189 WatchSource:0}: Error finding container d7d5dbafc155878b8eed18e2b35f3613b5d776cb483ae10923a6957d8f04e189: Status 404 returned error can't find the container with id d7d5dbafc155878b8eed18e2b35f3613b5d776cb483ae10923a6957d8f04e189 Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.400911 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:36:07 crc kubenswrapper[4691]: W0930 06:36:07.444590 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4a08181f_97e1_4058_b391_f380edf04dc4.slice/crio-8e4e3fbf7d5088afd10ef352f1633264dcb34ab47c991a6b945010bc5a767aa2 WatchSource:0}: Error finding container 8e4e3fbf7d5088afd10ef352f1633264dcb34ab47c991a6b945010bc5a767aa2: Status 404 returned error can't find the container with id 8e4e3fbf7d5088afd10ef352f1633264dcb34ab47c991a6b945010bc5a767aa2 Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.510106 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b9e263ab-2384-42b0-8c7b-a787bcf361a9","Type":"ContainerStarted","Data":"7b769bc63e2eed1b3da0b54fd040ddc4dc6bd4ba6a38e7c98cbf24afbaa35567"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.520441 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d","Type":"ContainerStarted","Data":"1799b4e2e11916b8561607f8bd29fe120a6c0971cacdebfdd8856c7ae07d3ee5"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.524116 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4a08181f-97e1-4058-b391-f380edf04dc4","Type":"ContainerStarted","Data":"8e4e3fbf7d5088afd10ef352f1633264dcb34ab47c991a6b945010bc5a767aa2"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.537867 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jfg2g" event={"ID":"6315532b-2604-4b19-8b3f-cb4bb9ff83f6","Type":"ContainerStarted","Data":"a8f545033fc6e1ac50240d5db06872701110df32f61d65121614a167ed2acc9f"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.540401 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d99e-account-create-zrrb2" event={"ID":"4f76bf7e-6020-4d8a-a15c-f2d497629fd9","Type":"ContainerStarted","Data":"9d70f561bd9a05917f30998f661855d65f87422e663ee6959a1992cfead0778e"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.545068 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56884c66c5-5zr58" event={"ID":"0420af49-e022-4b04-8cbf-9ba0139fbcb1","Type":"ContainerStarted","Data":"57ae3cd41c66ec8325fbf0b409597f6dd6399dab2947012eabb216d897400a4b"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.546285 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wmzh4" event={"ID":"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1","Type":"ContainerStarted","Data":"21646715f8fefd8d0e0e219b03bab727da2d833cfd1b85af79f809a130c9a005"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.547396 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4","Type":"ContainerStarted","Data":"d7d5dbafc155878b8eed18e2b35f3613b5d776cb483ae10923a6957d8f04e189"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.548504 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7341-account-create-ln8wm" event={"ID":"0ce9ee68-c2b5-456f-9a12-5493b94729ea","Type":"ContainerStarted","Data":"6ea42558cd89fdbcf602cd6366287450c7f412f4318d07fe01f54914a22f3b1f"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.548521 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7341-account-create-ln8wm" event={"ID":"0ce9ee68-c2b5-456f-9a12-5493b94729ea","Type":"ContainerStarted","Data":"b22f4e1e047f93d0915d09e7cacd20ab204fe1b30a600792271dfaee3d4bcffc"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.555302 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" event={"ID":"04be0ebf-14ea-4b62-b235-af7e6fdff8ee","Type":"ContainerStarted","Data":"002be7e11cef2be9f31196c501b5898291e0636d0ba563a6f6c02f623012a8b3"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.556174 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f69875457-kvdnf" event={"ID":"730e43c6-3b1f-4a8c-9540-4ff131592381","Type":"ContainerStarted","Data":"973c7707713ff6b9ae3abbb47e9445dd8ea045b05ea1451d71c0c768973e3ff1"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.559374 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c5ed-account-create-d7rxh" event={"ID":"cae1274f-384f-452d-b80f-1c2a3712bb49","Type":"ContainerStarted","Data":"cc1ecb54b3b70313330ffb39e48682572ab7ba3f76dae551786cced814f4bb10"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.561129 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-7341-account-create-ln8wm" podStartSLOduration=3.56112097 podStartE2EDuration="3.56112097s" podCreationTimestamp="2025-09-30 06:36:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:07.560769699 +0000 UTC m=+1011.035790759" watchObservedRunningTime="2025-09-30 06:36:07.56112097 +0000 UTC m=+1011.036142010" Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.565055 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6781ac88-7516-4101-8abd-9cacfbb930b7","Type":"ContainerStarted","Data":"7f7be92aa56741158622c73542f0e8a2e3cf431bdb549192ed6cc40991f90812"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.572750 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"a1cbb8f9-118e-48b5-ae92-067ece5295a2","Type":"ContainerStarted","Data":"d8b8310b1fbbb98511bfeaa0ddaf5c55824e3d49e5b5758e3664cdb2043c48ae"} Sep 30 06:36:07 crc kubenswrapper[4691]: I0930 06:36:07.590063 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67dcbcd77c-9lrb5" event={"ID":"ca89855f-21bb-4d05-93aa-5705f6d93548","Type":"ContainerStarted","Data":"3817e72d071dad559c56b9f590afb0ee39a713a0f26a6196e6f73c2414d19909"} Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.138201 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.187969 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:36:08 crc 
kubenswrapper[4691]: I0930 06:36:08.197765 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67dcbcd77c-9lrb5"] Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.228642 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-596cd479b5-w8rf2"] Sep 30 06:36:08 crc kubenswrapper[4691]: E0930 06:36:08.229163 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0defd4-49be-4dd9-a218-dd734aa84089" containerName="dnsmasq-dns" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.229177 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0defd4-49be-4dd9-a218-dd734aa84089" containerName="dnsmasq-dns" Sep 30 06:36:08 crc kubenswrapper[4691]: E0930 06:36:08.229200 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d0defd4-49be-4dd9-a218-dd734aa84089" containerName="init" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.229205 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d0defd4-49be-4dd9-a218-dd734aa84089" containerName="init" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.229495 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d0defd4-49be-4dd9-a218-dd734aa84089" containerName="dnsmasq-dns" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.231334 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.269675 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-596cd479b5-w8rf2"] Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.284948 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.321469 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.401971 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c4dd18-3200-4193-9868-7315a13103b3-logs\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.402019 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-config-data\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.402273 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/95c4dd18-3200-4193-9868-7315a13103b3-horizon-secret-key\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.403079 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-scripts\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.403150 4691 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8qqs\" (UniqueName: \"kubernetes.io/projected/95c4dd18-3200-4193-9868-7315a13103b3-kube-api-access-s8qqs\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.511977 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-scripts\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.512223 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8qqs\" (UniqueName: \"kubernetes.io/projected/95c4dd18-3200-4193-9868-7315a13103b3-kube-api-access-s8qqs\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.512281 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c4dd18-3200-4193-9868-7315a13103b3-logs\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.512300 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-config-data\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.512328 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/95c4dd18-3200-4193-9868-7315a13103b3-horizon-secret-key\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.515358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-scripts\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.522693 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c4dd18-3200-4193-9868-7315a13103b3-logs\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.523391 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-config-data\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.529706 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/95c4dd18-3200-4193-9868-7315a13103b3-horizon-secret-key\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.532745 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8qqs\" (UniqueName: \"kubernetes.io/projected/95c4dd18-3200-4193-9868-7315a13103b3-kube-api-access-s8qqs\") pod \"horizon-596cd479b5-w8rf2\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.612895 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.616935 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4","Type":"ContainerStarted","Data":"804d1b04157cf67e2e31ef98ae0b953c9c0f08bdd533624d4fec354dc9620e6a"} Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.630542 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4a08181f-97e1-4058-b391-f380edf04dc4","Type":"ContainerStarted","Data":"fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709"} Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.637667 4691 generic.go:334] "Generic (PLEG): container finished" podID="04be0ebf-14ea-4b62-b235-af7e6fdff8ee" containerID="8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c" exitCode=0 Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.637727 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" event={"ID":"04be0ebf-14ea-4b62-b235-af7e6fdff8ee","Type":"ContainerDied","Data":"8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c"} Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.640933 4691 generic.go:334] "Generic (PLEG): container finished" podID="0420af49-e022-4b04-8cbf-9ba0139fbcb1" containerID="c969900617664210534e055bec1fd1c39872613650f66087196d86c90d6076d2" exitCode=0 Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.640991 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56884c66c5-5zr58" event={"ID":"0420af49-e022-4b04-8cbf-9ba0139fbcb1","Type":"ContainerDied","Data":"c969900617664210534e055bec1fd1c39872613650f66087196d86c90d6076d2"} Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.651002 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wmzh4" event={"ID":"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1","Type":"ContainerStarted","Data":"0becdf2d4e2225f3ba85bfa6fa6aabc1805b8268d9fe2c4cf31a3f417ac427ba"} Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.659807 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d","Type":"ContainerStarted","Data":"58ea47eba5766897abe87f1915f5f3bb77b89b35aba64c8932cb8aad7e531a91"} Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.676342 4691 generic.go:334] "Generic (PLEG): container finished" podID="cae1274f-384f-452d-b80f-1c2a3712bb49" containerID="1836171cceace96fc472d2ebd5aed274964abb8c115c73e77f97741aa6d01dad" exitCode=0 Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.676451 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-c5ed-account-create-d7rxh" event={"ID":"cae1274f-384f-452d-b80f-1c2a3712bb49","Type":"ContainerDied","Data":"1836171cceace96fc472d2ebd5aed274964abb8c115c73e77f97741aa6d01dad"} Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.686252 4691 generic.go:334] "Generic (PLEG): container finished" podID="0ce9ee68-c2b5-456f-9a12-5493b94729ea" containerID="6ea42558cd89fdbcf602cd6366287450c7f412f4318d07fe01f54914a22f3b1f" exitCode=0 Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.686555 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7341-account-create-ln8wm" event={"ID":"0ce9ee68-c2b5-456f-9a12-5493b94729ea","Type":"ContainerDied","Data":"6ea42558cd89fdbcf602cd6366287450c7f412f4318d07fe01f54914a22f3b1f"} Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.712552 4691 generic.go:334] "Generic (PLEG): container finished" podID="4f76bf7e-6020-4d8a-a15c-f2d497629fd9" containerID="b315a9a23aa8a12ed4e264ade7a72252704a75749ba990f4d4a22db0dcd64e9f" exitCode=0 Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.712594 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d99e-account-create-zrrb2" event={"ID":"4f76bf7e-6020-4d8a-a15c-f2d497629fd9","Type":"ContainerDied","Data":"b315a9a23aa8a12ed4e264ade7a72252704a75749ba990f4d4a22db0dcd64e9f"} Sep 30 06:36:08 crc kubenswrapper[4691]: I0930 06:36:08.712626 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-wmzh4" podStartSLOduration=4.712608307 podStartE2EDuration="4.712608307s" podCreationTimestamp="2025-09-30 06:36:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:08.705704505 +0000 UTC m=+1012.180725555" watchObservedRunningTime="2025-09-30 06:36:08.712608307 +0000 UTC m=+1012.187629347" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.470129 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56884c66c5-5zr58" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.636641 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-sb\") pod \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.636712 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-config\") pod \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.636783 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6ff9\" (UniqueName: \"kubernetes.io/projected/0420af49-e022-4b04-8cbf-9ba0139fbcb1-kube-api-access-z6ff9\") pod \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.636876 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-swift-storage-0\") pod \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.636908 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-nb\") pod \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.636959 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-svc\") pod \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\" (UID: \"0420af49-e022-4b04-8cbf-9ba0139fbcb1\") " Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.640316 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0420af49-e022-4b04-8cbf-9ba0139fbcb1-kube-api-access-z6ff9" (OuterVolumeSpecName: "kube-api-access-z6ff9") pod "0420af49-e022-4b04-8cbf-9ba0139fbcb1" (UID: "0420af49-e022-4b04-8cbf-9ba0139fbcb1"). InnerVolumeSpecName "kube-api-access-z6ff9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.659692 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-config" (OuterVolumeSpecName: "config") pod "0420af49-e022-4b04-8cbf-9ba0139fbcb1" (UID: "0420af49-e022-4b04-8cbf-9ba0139fbcb1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.660151 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "0420af49-e022-4b04-8cbf-9ba0139fbcb1" (UID: "0420af49-e022-4b04-8cbf-9ba0139fbcb1"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.660225 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "0420af49-e022-4b04-8cbf-9ba0139fbcb1" (UID: "0420af49-e022-4b04-8cbf-9ba0139fbcb1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.664400 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "0420af49-e022-4b04-8cbf-9ba0139fbcb1" (UID: "0420af49-e022-4b04-8cbf-9ba0139fbcb1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.685168 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "0420af49-e022-4b04-8cbf-9ba0139fbcb1" (UID: "0420af49-e022-4b04-8cbf-9ba0139fbcb1"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.735939 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56884c66c5-5zr58" event={"ID":"0420af49-e022-4b04-8cbf-9ba0139fbcb1","Type":"ContainerDied","Data":"57ae3cd41c66ec8325fbf0b409597f6dd6399dab2947012eabb216d897400a4b"} Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.736004 4691 scope.go:117] "RemoveContainer" containerID="c969900617664210534e055bec1fd1c39872613650f66087196d86c90d6076d2" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.735963 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56884c66c5-5zr58" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.739246 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z6ff9\" (UniqueName: \"kubernetes.io/projected/0420af49-e022-4b04-8cbf-9ba0139fbcb1-kube-api-access-z6ff9\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.739267 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.739276 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.739286 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.739294 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.739302 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0420af49-e022-4b04-8cbf-9ba0139fbcb1-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.740779 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d","Type":"ContainerStarted","Data":"804a939313f46aebffa0047313d44f607be5216b0c042e5bcae445243847c38d"} Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.741100 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api-log" containerID="cri-o://58ea47eba5766897abe87f1915f5f3bb77b89b35aba64c8932cb8aad7e531a91" gracePeriod=30 Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.741190 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api" containerID="cri-o://804a939313f46aebffa0047313d44f607be5216b0c042e5bcae445243847c38d" gracePeriod=30 Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.762578 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=5.762563095 podStartE2EDuration="5.762563095s" podCreationTimestamp="2025-09-30 06:36:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:09.755344352 +0000 UTC m=+1013.230365392" watchObservedRunningTime="2025-09-30 06:36:09.762563095 +0000 UTC m=+1013.237584135" Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.821263 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56884c66c5-5zr58"] Sep 30 06:36:09 crc kubenswrapper[4691]: I0930 06:36:09.828469 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56884c66c5-5zr58"] Sep 30 
06:36:10 crc kubenswrapper[4691]: I0930 06:36:10.327534 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 06:36:10 crc kubenswrapper[4691]: I0930 06:36:10.327597 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 06:36:10 crc kubenswrapper[4691]: I0930 06:36:10.753937 4691 generic.go:334] "Generic (PLEG): container finished" podID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerID="58ea47eba5766897abe87f1915f5f3bb77b89b35aba64c8932cb8aad7e531a91" exitCode=143 Sep 30 06:36:10 crc kubenswrapper[4691]: I0930 06:36:10.754191 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d","Type":"ContainerDied","Data":"58ea47eba5766897abe87f1915f5f3bb77b89b35aba64c8932cb8aad7e531a91"} Sep 30 06:36:11 crc kubenswrapper[4691]: I0930 06:36:11.245933 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0420af49-e022-4b04-8cbf-9ba0139fbcb1" path="/var/lib/kubelet/pods/0420af49-e022-4b04-8cbf-9ba0139fbcb1/volumes" Sep 30 06:36:12 crc kubenswrapper[4691]: I0930 06:36:12.592610 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 06:36:12 crc kubenswrapper[4691]: E0930 06:36:12.631793 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f1deff0_17e6_49cd_bf71_eeeecee2c1c1.slice/crio-0becdf2d4e2225f3ba85bfa6fa6aabc1805b8268d9fe2c4cf31a3f417ac427ba.scope\": RecentStats: unable to find data in memory cache]" Sep 30 06:36:12 crc kubenswrapper[4691]: I0930 06:36:12.798150 4691 generic.go:334] "Generic (PLEG): container finished" podID="7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" containerID="0becdf2d4e2225f3ba85bfa6fa6aabc1805b8268d9fe2c4cf31a3f417ac427ba" exitCode=0 Sep 30 06:36:12 crc kubenswrapper[4691]: I0930 06:36:12.798188 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wmzh4" event={"ID":"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1","Type":"ContainerDied","Data":"0becdf2d4e2225f3ba85bfa6fa6aabc1805b8268d9fe2c4cf31a3f417ac427ba"} Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.678787 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f69875457-kvdnf"] Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.706125 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5755bf4df8-zx9td"] Sep 30 06:36:13 crc kubenswrapper[4691]: E0930 06:36:13.706444 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0420af49-e022-4b04-8cbf-9ba0139fbcb1" containerName="init" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.706458 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0420af49-e022-4b04-8cbf-9ba0139fbcb1" containerName="init" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.706650 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0420af49-e022-4b04-8cbf-9ba0139fbcb1" containerName="init" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.707527 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.710609 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.725350 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5755bf4df8-zx9td"] Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.763235 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-596cd479b5-w8rf2"] Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.792774 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-56948c48fd-czzmm"] Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.794201 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.803721 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56948c48fd-czzmm"] Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.813123 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlmvf\" (UniqueName: \"kubernetes.io/projected/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-kube-api-access-rlmvf\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.813161 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-logs\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.813237 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-secret-key\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.813269 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-scripts\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.813313 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-combined-ca-bundle\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.813332 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-tls-certs\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.813359 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-config-data\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.921971 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlmvf\" (UniqueName: \"kubernetes.io/projected/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-kube-api-access-rlmvf\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922021 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-logs\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922064 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fec95e1e-14f4-4093-b1d4-402c29686348-horizon-secret-key\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922095 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fec95e1e-14f4-4093-b1d4-402c29686348-config-data\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922128 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-secret-key\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922147 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fec95e1e-14f4-4093-b1d4-402c29686348-scripts\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922165 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-scripts\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922186 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec95e1e-14f4-4093-b1d4-402c29686348-horizon-tls-certs\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922221 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-combined-ca-bundle\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922240 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-tls-certs\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922270 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wg5l\" (UniqueName: \"kubernetes.io/projected/fec95e1e-14f4-4093-b1d4-402c29686348-kube-api-access-2wg5l\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922286 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-config-data\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922303 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec95e1e-14f4-4093-b1d4-402c29686348-combined-ca-bundle\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922330 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fec95e1e-14f4-4093-b1d4-402c29686348-logs\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.922987 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-logs\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.926774 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-config-data\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.929823 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-scripts\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.930811 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-combined-ca-bundle\") pod 
\"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.931294 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-secret-key\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.931341 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-tls-certs\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:13 crc kubenswrapper[4691]: I0930 06:36:13.957627 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlmvf\" (UniqueName: \"kubernetes.io/projected/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-kube-api-access-rlmvf\") pod \"horizon-5755bf4df8-zx9td\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") " pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.029694 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wg5l\" (UniqueName: \"kubernetes.io/projected/fec95e1e-14f4-4093-b1d4-402c29686348-kube-api-access-2wg5l\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.030086 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec95e1e-14f4-4093-b1d4-402c29686348-combined-ca-bundle\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.030142 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fec95e1e-14f4-4093-b1d4-402c29686348-logs\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.030270 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fec95e1e-14f4-4093-b1d4-402c29686348-horizon-secret-key\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.032962 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fec95e1e-14f4-4093-b1d4-402c29686348-config-data\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.033112 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fec95e1e-14f4-4093-b1d4-402c29686348-scripts\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc 
kubenswrapper[4691]: I0930 06:36:14.033165 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec95e1e-14f4-4093-b1d4-402c29686348-horizon-tls-certs\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.037024 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fec95e1e-14f4-4093-b1d4-402c29686348-logs\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.037472 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/fec95e1e-14f4-4093-b1d4-402c29686348-scripts\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.037652 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fec95e1e-14f4-4093-b1d4-402c29686348-combined-ca-bundle\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.039240 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.039818 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fec95e1e-14f4-4093-b1d4-402c29686348-config-data\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.041377 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/fec95e1e-14f4-4093-b1d4-402c29686348-horizon-secret-key\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.041683 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/fec95e1e-14f4-4093-b1d4-402c29686348-horizon-tls-certs\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.054612 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wg5l\" (UniqueName: \"kubernetes.io/projected/fec95e1e-14f4-4093-b1d4-402c29686348-kube-api-access-2wg5l\") pod \"horizon-56948c48fd-czzmm\" (UID: \"fec95e1e-14f4-4093-b1d4-402c29686348\") " pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.113269 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.384165 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d99e-account-create-zrrb2" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.400962 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7341-account-create-ln8wm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.407131 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c5ed-account-create-d7rxh" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.547166 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r65zg\" (UniqueName: \"kubernetes.io/projected/cae1274f-384f-452d-b80f-1c2a3712bb49-kube-api-access-r65zg\") pod \"cae1274f-384f-452d-b80f-1c2a3712bb49\" (UID: \"cae1274f-384f-452d-b80f-1c2a3712bb49\") " Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.547493 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prd5c\" (UniqueName: \"kubernetes.io/projected/0ce9ee68-c2b5-456f-9a12-5493b94729ea-kube-api-access-prd5c\") pod \"0ce9ee68-c2b5-456f-9a12-5493b94729ea\" (UID: \"0ce9ee68-c2b5-456f-9a12-5493b94729ea\") " Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.547517 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d84l5\" (UniqueName: \"kubernetes.io/projected/4f76bf7e-6020-4d8a-a15c-f2d497629fd9-kube-api-access-d84l5\") pod \"4f76bf7e-6020-4d8a-a15c-f2d497629fd9\" (UID: \"4f76bf7e-6020-4d8a-a15c-f2d497629fd9\") " Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.553741 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ce9ee68-c2b5-456f-9a12-5493b94729ea-kube-api-access-prd5c" (OuterVolumeSpecName: "kube-api-access-prd5c") pod "0ce9ee68-c2b5-456f-9a12-5493b94729ea" (UID: "0ce9ee68-c2b5-456f-9a12-5493b94729ea"). InnerVolumeSpecName "kube-api-access-prd5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.553825 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f76bf7e-6020-4d8a-a15c-f2d497629fd9-kube-api-access-d84l5" (OuterVolumeSpecName: "kube-api-access-d84l5") pod "4f76bf7e-6020-4d8a-a15c-f2d497629fd9" (UID: "4f76bf7e-6020-4d8a-a15c-f2d497629fd9"). InnerVolumeSpecName "kube-api-access-d84l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.554049 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cae1274f-384f-452d-b80f-1c2a3712bb49-kube-api-access-r65zg" (OuterVolumeSpecName: "kube-api-access-r65zg") pod "cae1274f-384f-452d-b80f-1c2a3712bb49" (UID: "cae1274f-384f-452d-b80f-1c2a3712bb49"). InnerVolumeSpecName "kube-api-access-r65zg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.650429 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r65zg\" (UniqueName: \"kubernetes.io/projected/cae1274f-384f-452d-b80f-1c2a3712bb49-kube-api-access-r65zg\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.650458 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prd5c\" (UniqueName: \"kubernetes.io/projected/0ce9ee68-c2b5-456f-9a12-5493b94729ea-kube-api-access-prd5c\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.650468 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d84l5\" (UniqueName: \"kubernetes.io/projected/4f76bf7e-6020-4d8a-a15c-f2d497629fd9-kube-api-access-d84l5\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.821358 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c5ed-account-create-d7rxh" event={"ID":"cae1274f-384f-452d-b80f-1c2a3712bb49","Type":"ContainerDied","Data":"cc1ecb54b3b70313330ffb39e48682572ab7ba3f76dae551786cced814f4bb10"} Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.821402 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc1ecb54b3b70313330ffb39e48682572ab7ba3f76dae551786cced814f4bb10" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.821448 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c5ed-account-create-d7rxh" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.825047 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-7341-account-create-ln8wm" event={"ID":"0ce9ee68-c2b5-456f-9a12-5493b94729ea","Type":"ContainerDied","Data":"b22f4e1e047f93d0915d09e7cacd20ab204fe1b30a600792271dfaee3d4bcffc"} Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.825072 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b22f4e1e047f93d0915d09e7cacd20ab204fe1b30a600792271dfaee3d4bcffc" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.825087 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-7341-account-create-ln8wm" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.826332 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-d99e-account-create-zrrb2" event={"ID":"4f76bf7e-6020-4d8a-a15c-f2d497629fd9","Type":"ContainerDied","Data":"9d70f561bd9a05917f30998f661855d65f87422e663ee6959a1992cfead0778e"} Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.826355 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9d70f561bd9a05917f30998f661855d65f87422e663ee6959a1992cfead0778e" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.826388 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-d99e-account-create-zrrb2" Sep 30 06:36:14 crc kubenswrapper[4691]: I0930 06:36:14.855016 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-596cd479b5-w8rf2"] Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.407236 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-fh78b"] Sep 30 06:36:19 crc kubenswrapper[4691]: E0930 06:36:19.408134 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cae1274f-384f-452d-b80f-1c2a3712bb49" containerName="mariadb-account-create" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.408152 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cae1274f-384f-452d-b80f-1c2a3712bb49" containerName="mariadb-account-create" Sep 30 06:36:19 crc kubenswrapper[4691]: E0930 06:36:19.408183 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f76bf7e-6020-4d8a-a15c-f2d497629fd9" containerName="mariadb-account-create" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.408190 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f76bf7e-6020-4d8a-a15c-f2d497629fd9" containerName="mariadb-account-create" Sep 30 06:36:19 crc kubenswrapper[4691]: E0930 06:36:19.408213 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ce9ee68-c2b5-456f-9a12-5493b94729ea" containerName="mariadb-account-create" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.408220 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ce9ee68-c2b5-456f-9a12-5493b94729ea" containerName="mariadb-account-create" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.408407 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cae1274f-384f-452d-b80f-1c2a3712bb49" containerName="mariadb-account-create" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.408420 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ce9ee68-c2b5-456f-9a12-5493b94729ea" containerName="mariadb-account-create" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.408435 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f76bf7e-6020-4d8a-a15c-f2d497629fd9" containerName="mariadb-account-create" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.409151 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.411518 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.418396 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.434352 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-n5crj" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.444809 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fh78b"] Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.558071 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/071e402d-9775-412e-ad8a-1643cd646d7c-etc-machine-id\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.558155 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-combined-ca-bundle\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.558274 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-scripts\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.558299 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-config-data\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.558326 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-db-sync-config-data\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.558379 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjg5r\" (UniqueName: \"kubernetes.io/projected/071e402d-9775-412e-ad8a-1643cd646d7c-kube-api-access-mjg5r\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.659565 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-combined-ca-bundle\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.659667 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-scripts\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.659689 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-config-data\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.659721 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-db-sync-config-data\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.659775 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjg5r\" (UniqueName: \"kubernetes.io/projected/071e402d-9775-412e-ad8a-1643cd646d7c-kube-api-access-mjg5r\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.659799 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/071e402d-9775-412e-ad8a-1643cd646d7c-etc-machine-id\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.659905 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/071e402d-9775-412e-ad8a-1643cd646d7c-etc-machine-id\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.665754 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-scripts\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.666085 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-combined-ca-bundle\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.668139 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-db-sync-config-data\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.673499 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-config-data\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " 
pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.688712 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjg5r\" (UniqueName: \"kubernetes.io/projected/071e402d-9775-412e-ad8a-1643cd646d7c-kube-api-access-mjg5r\") pod \"cinder-db-sync-fh78b\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.769708 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fh78b" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.904340 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-sb2qw"] Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.905546 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.907211 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.907589 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pzgvz" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.907957 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.918550 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-pblkf"] Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.919769 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.921640 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-84nz8" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.921896 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.926743 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-sb2qw"] Sep 30 06:36:19 crc kubenswrapper[4691]: I0930 06:36:19.933331 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pblkf"] Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.066194 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-db-sync-config-data\") pod \"barbican-db-sync-pblkf\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.066273 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-combined-ca-bundle\") pod \"barbican-db-sync-pblkf\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.066334 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-combined-ca-bundle\") pod \"neutron-db-sync-sb2qw\" (UID: 
\"1f010c19-f02f-4c8b-8b12-1f357e860666\") " pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.066351 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f292l\" (UniqueName: \"kubernetes.io/projected/1f010c19-f02f-4c8b-8b12-1f357e860666-kube-api-access-f292l\") pod \"neutron-db-sync-sb2qw\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.066439 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjv72\" (UniqueName: \"kubernetes.io/projected/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-kube-api-access-tjv72\") pod \"barbican-db-sync-pblkf\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.066497 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-config\") pod \"neutron-db-sync-sb2qw\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.172768 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-db-sync-config-data\") pod \"barbican-db-sync-pblkf\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.172828 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-combined-ca-bundle\") pod \"barbican-db-sync-pblkf\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.172860 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-combined-ca-bundle\") pod \"neutron-db-sync-sb2qw\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.172876 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f292l\" (UniqueName: \"kubernetes.io/projected/1f010c19-f02f-4c8b-8b12-1f357e860666-kube-api-access-f292l\") pod \"neutron-db-sync-sb2qw\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.172937 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjv72\" (UniqueName: \"kubernetes.io/projected/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-kube-api-access-tjv72\") pod \"barbican-db-sync-pblkf\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.172993 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-config\") pod \"neutron-db-sync-sb2qw\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " 
pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.181470 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-db-sync-config-data\") pod \"barbican-db-sync-pblkf\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.181860 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-config\") pod \"neutron-db-sync-sb2qw\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.191500 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-combined-ca-bundle\") pod \"neutron-db-sync-sb2qw\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.193618 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f292l\" (UniqueName: \"kubernetes.io/projected/1f010c19-f02f-4c8b-8b12-1f357e860666-kube-api-access-f292l\") pod \"neutron-db-sync-sb2qw\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.193774 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-combined-ca-bundle\") pod \"barbican-db-sync-pblkf\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.207555 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjv72\" (UniqueName: \"kubernetes.io/projected/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-kube-api-access-tjv72\") pod \"barbican-db-sync-pblkf\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.242245 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.254556 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-pblkf" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.274198 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.375279 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-credential-keys\") pod \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.375559 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-scripts\") pod \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.375784 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-fernet-keys\") pod \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.375913 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-config-data\") pod \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.376062 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7t6h\" (UniqueName: \"kubernetes.io/projected/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-kube-api-access-w7t6h\") pod \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.376143 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-combined-ca-bundle\") pod \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.379496 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-scripts" (OuterVolumeSpecName: "scripts") pod "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" (UID: "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.379940 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" (UID: "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.382935 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-kube-api-access-w7t6h" (OuterVolumeSpecName: "kube-api-access-w7t6h") pod "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" (UID: "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1"). InnerVolumeSpecName "kube-api-access-w7t6h". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.386125 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" (UID: "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:20 crc kubenswrapper[4691]: E0930 06:36:20.404191 4691 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-combined-ca-bundle podName:7f1deff0-17e6-49cd-bf71-eeeecee2c1c1 nodeName:}" failed. No retries permitted until 2025-09-30 06:36:20.904164581 +0000 UTC m=+1024.379185621 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-combined-ca-bundle") pod "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" (UID: "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1") : error deleting /var/lib/kubelet/pods/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1/volume-subpaths: remove /var/lib/kubelet/pods/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1/volume-subpaths: no such file or directory Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.406689 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-config-data" (OuterVolumeSpecName: "config-data") pod "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" (UID: "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.478124 4691 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.478164 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.478175 4691 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.478187 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.478199 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7t6h\" (UniqueName: \"kubernetes.io/projected/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-kube-api-access-w7t6h\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:20 crc kubenswrapper[4691]: E0930 06:36:20.665008 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Sep 30 06:36:20 crc kubenswrapper[4691]: E0930 06:36:20.665089 4691 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="38.102.83.30:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest" Sep 30 06:36:20 crc kubenswrapper[4691]: E0930 06:36:20.665279 4691 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:38.102.83.30:5001/podified-master-centos10/openstack-ceilometer-central:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5d5h5b9h5f7h597h57h5d7h9hf4h5b9h598h55bh5f9h586h5f6h5c8h664hd9h696h545h56fh647h54ch656hb5h58h6dh76h85h7ch687h577h7q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vmmmv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(6781ac88-7516-4101-8abd-9cacfbb930b7): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 06:36:20 crc kubenswrapper[4691]: W0930 06:36:20.734120 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95c4dd18_3200_4193_9868_7315a13103b3.slice/crio-f1718dd175fa0a0a38f6f40c796e91a7ca46eb4853eeef23b158e6929a434fe5 WatchSource:0}: Error finding container f1718dd175fa0a0a38f6f40c796e91a7ca46eb4853eeef23b158e6929a434fe5: Status 404 returned error can't find the container with id f1718dd175fa0a0a38f6f40c796e91a7ca46eb4853eeef23b158e6929a434fe5 Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.891125 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-wmzh4" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.891125 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-wmzh4" event={"ID":"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1","Type":"ContainerDied","Data":"21646715f8fefd8d0e0e219b03bab727da2d833cfd1b85af79f809a130c9a005"} Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.891498 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21646715f8fefd8d0e0e219b03bab727da2d833cfd1b85af79f809a130c9a005" Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.893353 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-596cd479b5-w8rf2" event={"ID":"95c4dd18-3200-4193-9868-7315a13103b3","Type":"ContainerStarted","Data":"f1718dd175fa0a0a38f6f40c796e91a7ca46eb4853eeef23b158e6929a434fe5"} Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.986800 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-combined-ca-bundle\") pod \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\" (UID: \"7f1deff0-17e6-49cd-bf71-eeeecee2c1c1\") " Sep 30 06:36:20 crc kubenswrapper[4691]: I0930 06:36:20.992450 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" (UID: "7f1deff0-17e6-49cd-bf71-eeeecee2c1c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.089039 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.349292 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5755bf4df8-zx9td"] Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.364193 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-wmzh4"] Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.374021 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-wmzh4"] Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.448615 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-j7phx"] Sep 30 06:36:21 crc kubenswrapper[4691]: E0930 06:36:21.450082 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" containerName="keystone-bootstrap" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.450101 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" containerName="keystone-bootstrap" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.450290 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" containerName="keystone-bootstrap" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.453422 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.461801 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j7phx"] Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.462812 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.462995 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.463105 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.463199 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fxfcb" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.537930 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-56948c48fd-czzmm"] Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.596718 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6ww5\" (UniqueName: \"kubernetes.io/projected/ca252689-a07e-4e84-a79f-7884687c6db3-kube-api-access-b6ww5\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.597002 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-credential-keys\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.597098 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-config-data\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.597122 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-scripts\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.597149 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-combined-ca-bundle\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.597206 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-fernet-keys\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.612957 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-db-sync-sb2qw"] Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.622905 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-pblkf"] Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.698342 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-fernet-keys\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.698378 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6ww5\" (UniqueName: \"kubernetes.io/projected/ca252689-a07e-4e84-a79f-7884687c6db3-kube-api-access-b6ww5\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.698398 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-credential-keys\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.698486 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-config-data\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.698508 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-scripts\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.698540 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-combined-ca-bundle\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.726088 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-combined-ca-bundle\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.726287 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-scripts\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.727482 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6ww5\" (UniqueName: \"kubernetes.io/projected/ca252689-a07e-4e84-a79f-7884687c6db3-kube-api-access-b6ww5\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " 
pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.728860 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-config-data\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.731408 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-fernet-keys\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.734223 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-credential-keys\") pod \"keystone-bootstrap-j7phx\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.792049 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.863952 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-fh78b"] Sep 30 06:36:21 crc kubenswrapper[4691]: I0930 06:36:21.984731 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f69875457-kvdnf" event={"ID":"730e43c6-3b1f-4a8c-9540-4ff131592381","Type":"ContainerStarted","Data":"be42e16a38be0dbe741dfa7d7ee8358eef6373849e72d61dfa4010eb140cc226"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.009104 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"a1cbb8f9-118e-48b5-ae92-067ece5295a2","Type":"ContainerStarted","Data":"0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.038119 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pblkf" event={"ID":"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21","Type":"ContainerStarted","Data":"b9703e08c9440c1c4381523edd6c55dba9d1a581cb0311632ae4dd7494ec41bf"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.040963 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-596cd479b5-w8rf2" event={"ID":"95c4dd18-3200-4193-9868-7315a13103b3","Type":"ContainerStarted","Data":"97368ae4b6b7644978046f502ca13057d3a35202a18e92e04b831593c190457f"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.043351 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jfg2g" event={"ID":"6315532b-2604-4b19-8b3f-cb4bb9ff83f6","Type":"ContainerStarted","Data":"10cd68f7901e1f9bda7ddcbd82ff985ba27907d359a88feddc49294014fb128d"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.046275 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" event={"ID":"04be0ebf-14ea-4b62-b235-af7e6fdff8ee","Type":"ContainerStarted","Data":"1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.046987 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:22 crc kubenswrapper[4691]: 
I0930 06:36:22.050182 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b9e263ab-2384-42b0-8c7b-a787bcf361a9","Type":"ContainerStarted","Data":"3d912d20c8216080d5d5d0f48d6809ba39b87bf21d5a0613b94bdc715405462c"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.052022 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56948c48fd-czzmm" event={"ID":"fec95e1e-14f4-4093-b1d4-402c29686348","Type":"ContainerStarted","Data":"4de7cb943d4965cc60ff6df97931bda51292ac3d62d8c7731c658ffed245065b"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.053038 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sb2qw" event={"ID":"1f010c19-f02f-4c8b-8b12-1f357e860666","Type":"ContainerStarted","Data":"7c2b1cb34fb8435a0af0821e1b6738645ad380afe74f7ff343bdb442000c1ed7"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.054379 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67dcbcd77c-9lrb5" event={"ID":"ca89855f-21bb-4d05-93aa-5705f6d93548","Type":"ContainerStarted","Data":"3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.060749 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5755bf4df8-zx9td" event={"ID":"cc29be35-3ceb-4a88-af6e-77e2d0cbab83","Type":"ContainerStarted","Data":"7989021c2b637df47a2ef8b27fea833ce3c2ef304e7224dad15d0d8faa5ab7fb"} Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.071556 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=4.472713069 podStartE2EDuration="18.071536116s" podCreationTimestamp="2025-09-30 06:36:04 +0000 UTC" firstStartedPulling="2025-09-30 06:36:07.183380967 +0000 UTC m=+1010.658402007" lastFinishedPulling="2025-09-30 06:36:20.782204014 +0000 UTC m=+1024.257225054" observedRunningTime="2025-09-30 06:36:22.038396832 +0000 UTC m=+1025.513417872" watchObservedRunningTime="2025-09-30 06:36:22.071536116 +0000 UTC m=+1025.546557156" Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.082044 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-jfg2g" podStartSLOduration=3.344989147 podStartE2EDuration="17.082027842s" podCreationTimestamp="2025-09-30 06:36:05 +0000 UTC" firstStartedPulling="2025-09-30 06:36:07.062058992 +0000 UTC m=+1010.537080032" lastFinishedPulling="2025-09-30 06:36:20.799097687 +0000 UTC m=+1024.274118727" observedRunningTime="2025-09-30 06:36:22.060334266 +0000 UTC m=+1025.535355306" watchObservedRunningTime="2025-09-30 06:36:22.082027842 +0000 UTC m=+1025.557048882" Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.095471 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" podStartSLOduration=17.095457513 podStartE2EDuration="17.095457513s" podCreationTimestamp="2025-09-30 06:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:22.076220196 +0000 UTC m=+1025.551241236" watchObservedRunningTime="2025-09-30 06:36:22.095457513 +0000 UTC m=+1025.570478553" Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.107189 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=4.747749475 
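
The pod_startup_latency_tracker entries encode a simple relation: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration subtracts the image-pull window (lastFinishedPulling minus firstStartedPulling). When no pull happened, both pull timestamps are the zero time "0001-01-01" and SLO equals E2E, as in the dnsmasq entry (17.095457513s for both). The watcher-applier-0 numbers reproduce exactly:

```go
// Reproduce the watcher-applier-0 startup-latency arithmetic from the
// log: E2E = observed running - creation; SLO = E2E - image pull time.
package main

import (
	"fmt"
	"time"
)

func ts(s string) time.Time {
	t, err := time.Parse(time.RFC3339Nano, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := ts("2025-09-30T06:36:04Z")
	firstPull := ts("2025-09-30T06:36:07.183380967Z")
	lastPull := ts("2025-09-30T06:36:20.782204014Z")
	observed := ts("2025-09-30T06:36:22.071536116Z")

	e2e := observed.Sub(created)
	slo := e2e - lastPull.Sub(firstPull)
	fmt.Println("podStartE2EDuration:", e2e) // 18.071536116s, as logged
	fmt.Println("podStartSLOduration:", slo) // 4.472713069s, as logged
}
```
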
podStartE2EDuration="18.107170409s" podCreationTimestamp="2025-09-30 06:36:04 +0000 UTC" firstStartedPulling="2025-09-30 06:36:07.427959316 +0000 UTC m=+1010.902980356" lastFinishedPulling="2025-09-30 06:36:20.78738024 +0000 UTC m=+1024.262401290" observedRunningTime="2025-09-30 06:36:22.099583366 +0000 UTC m=+1025.574604416" watchObservedRunningTime="2025-09-30 06:36:22.107170409 +0000 UTC m=+1025.582191449" Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.850574 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:36:22 crc kubenswrapper[4691]: I0930 06:36:22.850998 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.081485 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56948c48fd-czzmm" event={"ID":"fec95e1e-14f4-4093-b1d4-402c29686348","Type":"ContainerStarted","Data":"1de65832b2754676c6b02cb9fbb2da0f0a0b1d713b9d4441ff1b0cc629be5059"} Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.091706 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4","Type":"ContainerStarted","Data":"9ef33d0d68543fde3c6f90b5ade44a9c735c42e3b762cfebb590e6e556c7a956"} Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.091782 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" containerName="glance-log" containerID="cri-o://804d1b04157cf67e2e31ef98ae0b953c9c0f08bdd533624d4fec354dc9620e6a" gracePeriod=30 Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.091823 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" containerName="glance-httpd" containerID="cri-o://9ef33d0d68543fde3c6f90b5ade44a9c735c42e3b762cfebb590e6e556c7a956" gracePeriod=30 Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.095097 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sb2qw" event={"ID":"1f010c19-f02f-4c8b-8b12-1f357e860666","Type":"ContainerStarted","Data":"f53ce7512c565ebd65d02bacc532dddd113915699808a3404774f62e491fb73d"} Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.106250 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5755bf4df8-zx9td" event={"ID":"cc29be35-3ceb-4a88-af6e-77e2d0cbab83","Type":"ContainerStarted","Data":"b433c5a51bea7ed49e49b10d76eaae08ac7b93bc283b6448ac328bf46caa578c"} Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.111522 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4a08181f-97e1-4058-b391-f380edf04dc4","Type":"ContainerStarted","Data":"1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf"} Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.111662 4691 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4a08181f-97e1-4058-b391-f380edf04dc4" containerName="glance-httpd" containerID="cri-o://1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf" gracePeriod=30 Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.111654 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="4a08181f-97e1-4058-b391-f380edf04dc4" containerName="glance-log" containerID="cri-o://fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709" gracePeriod=30 Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.125171 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fh78b" event={"ID":"071e402d-9775-412e-ad8a-1643cd646d7c","Type":"ContainerStarted","Data":"3e47f682055924cdf667683eaf5a375316289e58027ae0b596c305f0f002919c"} Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.138492 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=18.138473929 podStartE2EDuration="18.138473929s" podCreationTimestamp="2025-09-30 06:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:23.11856714 +0000 UTC m=+1026.593588190" watchObservedRunningTime="2025-09-30 06:36:23.138473929 +0000 UTC m=+1026.613494969" Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.150676 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-sb2qw" podStartSLOduration=4.150652599 podStartE2EDuration="4.150652599s" podCreationTimestamp="2025-09-30 06:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:23.135164622 +0000 UTC m=+1026.610185662" watchObservedRunningTime="2025-09-30 06:36:23.150652599 +0000 UTC m=+1026.625673639" Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.150797 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j7phx"] Sep 30 06:36:23 crc kubenswrapper[4691]: W0930 06:36:23.160688 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca252689_a07e_4e84_a79f_7884687c6db3.slice/crio-f0efd8fd7ae1b1bccd870316d8314f6abc06f1014bbd00c71ad10cdfd06fa70c WatchSource:0}: Error finding container f0efd8fd7ae1b1bccd870316d8314f6abc06f1014bbd00c71ad10cdfd06fa70c: Status 404 returned error can't find the container with id f0efd8fd7ae1b1bccd870316d8314f6abc06f1014bbd00c71ad10cdfd06fa70c Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.207543 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=18.207522964 podStartE2EDuration="18.207522964s" podCreationTimestamp="2025-09-30 06:36:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:23.171533099 +0000 UTC m=+1026.646554149" watchObservedRunningTime="2025-09-30 06:36:23.207522964 +0000 UTC m=+1026.682543994" Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.316125 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f1deff0-17e6-49cd-bf71-eeeecee2c1c1" 
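
"Killing container with a grace period" (gracePeriod=30) for the two glance pods is the standard stop sequence: the runtime delivers the stop signal and escalates to a hard kill only if the container outlives the grace period. A process-level sketch of the same pattern, with exec.Cmd standing in for a container and the plumbing assumed:

```go
// Sketch of a grace-period kill: TERM first, KILL if the process
// outlives the grace period. exec.Cmd stands in for a container.
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func killWithGracePeriod(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	cmd.Process.Signal(syscall.SIGTERM)
	select {
	case <-done:
		fmt.Println("container exited within grace period")
	case <-time.After(grace):
		cmd.Process.Kill() // escalate, as the runtime does after gracePeriod
		<-done
		fmt.Println("container force-killed after grace period")
	}
}

func main() {
	cmd := exec.Command("sleep", "300")
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	killWithGracePeriod(cmd, 2*time.Second) // the log uses gracePeriod=30
}
```
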
path="/var/lib/kubelet/pods/7f1deff0-17e6-49cd-bf71-eeeecee2c1c1/volumes" Sep 30 06:36:23 crc kubenswrapper[4691]: I0930 06:36:23.977511 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.115389 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-config-data\") pod \"4a08181f-97e1-4058-b391-f380edf04dc4\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.115498 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-combined-ca-bundle\") pod \"4a08181f-97e1-4058-b391-f380edf04dc4\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.115542 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-internal-tls-certs\") pod \"4a08181f-97e1-4058-b391-f380edf04dc4\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.115557 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"4a08181f-97e1-4058-b391-f380edf04dc4\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.115575 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-logs\") pod \"4a08181f-97e1-4058-b391-f380edf04dc4\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.115612 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-httpd-run\") pod \"4a08181f-97e1-4058-b391-f380edf04dc4\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.115694 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-scripts\") pod \"4a08181f-97e1-4058-b391-f380edf04dc4\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.115735 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kt7t\" (UniqueName: \"kubernetes.io/projected/4a08181f-97e1-4058-b391-f380edf04dc4-kube-api-access-6kt7t\") pod \"4a08181f-97e1-4058-b391-f380edf04dc4\" (UID: \"4a08181f-97e1-4058-b391-f380edf04dc4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.116206 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "4a08181f-97e1-4058-b391-f380edf04dc4" (UID: "4a08181f-97e1-4058-b391-f380edf04dc4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.116375 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-logs" (OuterVolumeSpecName: "logs") pod "4a08181f-97e1-4058-b391-f380edf04dc4" (UID: "4a08181f-97e1-4058-b391-f380edf04dc4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.118158 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.118182 4691 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/4a08181f-97e1-4058-b391-f380edf04dc4-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.121175 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a08181f-97e1-4058-b391-f380edf04dc4-kube-api-access-6kt7t" (OuterVolumeSpecName: "kube-api-access-6kt7t") pod "4a08181f-97e1-4058-b391-f380edf04dc4" (UID: "4a08181f-97e1-4058-b391-f380edf04dc4"). InnerVolumeSpecName "kube-api-access-6kt7t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.129082 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-scripts" (OuterVolumeSpecName: "scripts") pod "4a08181f-97e1-4058-b391-f380edf04dc4" (UID: "4a08181f-97e1-4058-b391-f380edf04dc4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.137472 4691 generic.go:334] "Generic (PLEG): container finished" podID="4a08181f-97e1-4058-b391-f380edf04dc4" containerID="1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf" exitCode=0 Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.137499 4691 generic.go:334] "Generic (PLEG): container finished" podID="4a08181f-97e1-4058-b391-f380edf04dc4" containerID="fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709" exitCode=143 Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.137535 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4a08181f-97e1-4058-b391-f380edf04dc4","Type":"ContainerDied","Data":"1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.137560 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4a08181f-97e1-4058-b391-f380edf04dc4","Type":"ContainerDied","Data":"fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.137570 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"4a08181f-97e1-4058-b391-f380edf04dc4","Type":"ContainerDied","Data":"8e4e3fbf7d5088afd10ef352f1633264dcb34ab47c991a6b945010bc5a767aa2"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.137584 4691 scope.go:117] "RemoveContainer" containerID="1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.137726 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.140540 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j7phx" event={"ID":"ca252689-a07e-4e84-a79f-7884687c6db3","Type":"ContainerStarted","Data":"102e58ebdd5146960fce63254dfc2ca4b3e0d16f64058217cd413713ce9faf5e"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.140567 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j7phx" event={"ID":"ca252689-a07e-4e84-a79f-7884687c6db3","Type":"ContainerStarted","Data":"f0efd8fd7ae1b1bccd870316d8314f6abc06f1014bbd00c71ad10cdfd06fa70c"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.158311 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "4a08181f-97e1-4058-b391-f380edf04dc4" (UID: "4a08181f-97e1-4058-b391-f380edf04dc4"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.160644 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-56948c48fd-czzmm" event={"ID":"fec95e1e-14f4-4093-b1d4-402c29686348","Type":"ContainerStarted","Data":"2dc14da3c76b924988caa1182d9ec5fc15bea9864fac9f94f0c0a665944a34e2"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.179082 4691 generic.go:334] "Generic (PLEG): container finished" podID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" containerID="9ef33d0d68543fde3c6f90b5ade44a9c735c42e3b762cfebb590e6e556c7a956" exitCode=0 Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.179543 4691 generic.go:334] "Generic (PLEG): container finished" podID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" containerID="804d1b04157cf67e2e31ef98ae0b953c9c0f08bdd533624d4fec354dc9620e6a" exitCode=143 Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.179234 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4","Type":"ContainerDied","Data":"9ef33d0d68543fde3c6f90b5ade44a9c735c42e3b762cfebb590e6e556c7a956"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.179629 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4","Type":"ContainerDied","Data":"804d1b04157cf67e2e31ef98ae0b953c9c0f08bdd533624d4fec354dc9620e6a"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.188880 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67dcbcd77c-9lrb5" event={"ID":"ca89855f-21bb-4d05-93aa-5705f6d93548","Type":"ContainerStarted","Data":"95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.189042 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67dcbcd77c-9lrb5" podUID="ca89855f-21bb-4d05-93aa-5705f6d93548" containerName="horizon-log" containerID="cri-o://3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3" gracePeriod=30 Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.189118 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-67dcbcd77c-9lrb5" podUID="ca89855f-21bb-4d05-93aa-5705f6d93548" containerName="horizon" containerID="cri-o://95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc" gracePeriod=30 Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.191459 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-j7phx" podStartSLOduration=3.191448424 podStartE2EDuration="3.191448424s" podCreationTimestamp="2025-09-30 06:36:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:24.178986653 +0000 UTC m=+1027.654007693" watchObservedRunningTime="2025-09-30 06:36:24.191448424 +0000 UTC m=+1027.666469454" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.197529 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6781ac88-7516-4101-8abd-9cacfbb930b7","Type":"ContainerStarted","Data":"50d0d88a5f1231386758cbaf3ab91925e14ccbf64d6051e87f834402f40a6ee1"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.201243 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a08181f-97e1-4058-b391-f380edf04dc4" (UID: "4a08181f-97e1-4058-b391-f380edf04dc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.206918 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-596cd479b5-w8rf2" podUID="95c4dd18-3200-4193-9868-7315a13103b3" containerName="horizon-log" containerID="cri-o://97368ae4b6b7644978046f502ca13057d3a35202a18e92e04b831593c190457f" gracePeriod=30 Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.206993 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-596cd479b5-w8rf2" event={"ID":"95c4dd18-3200-4193-9868-7315a13103b3","Type":"ContainerStarted","Data":"45fe2b4c11d6247066471f070f55ce4395f18598527836e58e649a9a6563713c"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.207020 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-596cd479b5-w8rf2" podUID="95c4dd18-3200-4193-9868-7315a13103b3" containerName="horizon" containerID="cri-o://45fe2b4c11d6247066471f070f55ce4395f18598527836e58e649a9a6563713c" gracePeriod=30 Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.212819 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5755bf4df8-zx9td" event={"ID":"cc29be35-3ceb-4a88-af6e-77e2d0cbab83","Type":"ContainerStarted","Data":"6cf0134f04e4636a4eb6deba05544969528280e6e7e910d2525ad3eab4aef3df"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.215174 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-56948c48fd-czzmm" podStartSLOduration=11.215155984 podStartE2EDuration="11.215155984s" podCreationTimestamp="2025-09-30 06:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:24.202794407 +0000 UTC m=+1027.677815447" watchObservedRunningTime="2025-09-30 06:36:24.215155984 +0000 UTC m=+1027.690177024" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.222771 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kt7t\" (UniqueName: \"kubernetes.io/projected/4a08181f-97e1-4058-b391-f380edf04dc4-kube-api-access-6kt7t\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.222795 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.222816 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.222826 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.228955 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-config-data" (OuterVolumeSpecName: "config-data") pod "4a08181f-97e1-4058-b391-f380edf04dc4" 
(UID: "4a08181f-97e1-4058-b391-f380edf04dc4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.229633 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-67dcbcd77c-9lrb5" podStartSLOduration=6.38805375 podStartE2EDuration="20.229614568s" podCreationTimestamp="2025-09-30 06:36:04 +0000 UTC" firstStartedPulling="2025-09-30 06:36:07.17536916 +0000 UTC m=+1010.650390200" lastFinishedPulling="2025-09-30 06:36:21.016929978 +0000 UTC m=+1024.491951018" observedRunningTime="2025-09-30 06:36:24.227022785 +0000 UTC m=+1027.702043845" watchObservedRunningTime="2025-09-30 06:36:24.229614568 +0000 UTC m=+1027.704635608" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.230524 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f69875457-kvdnf" event={"ID":"730e43c6-3b1f-4a8c-9540-4ff131592381","Type":"ContainerStarted","Data":"0a8b70070c7c4c4b4e43f91d12a71b2845f849b3dbe059d8b6fc0efc4dd3e0e8"} Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.231548 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f69875457-kvdnf" podUID="730e43c6-3b1f-4a8c-9540-4ff131592381" containerName="horizon-log" containerID="cri-o://be42e16a38be0dbe741dfa7d7ee8358eef6373849e72d61dfa4010eb140cc226" gracePeriod=30 Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.231640 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-f69875457-kvdnf" podUID="730e43c6-3b1f-4a8c-9540-4ff131592381" containerName="horizon" containerID="cri-o://0a8b70070c7c4c4b4e43f91d12a71b2845f849b3dbe059d8b6fc0efc4dd3e0e8" gracePeriod=30 Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.268547 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.275331 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-596cd479b5-w8rf2" podStartSLOduration=15.882062404 podStartE2EDuration="16.275310205s" podCreationTimestamp="2025-09-30 06:36:08 +0000 UTC" firstStartedPulling="2025-09-30 06:36:20.767028758 +0000 UTC m=+1024.242049808" lastFinishedPulling="2025-09-30 06:36:21.160276569 +0000 UTC m=+1024.635297609" observedRunningTime="2025-09-30 06:36:24.259841788 +0000 UTC m=+1027.734862838" watchObservedRunningTime="2025-09-30 06:36:24.275310205 +0000 UTC m=+1027.750331245" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.292958 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5755bf4df8-zx9td" podStartSLOduration=11.29293974 podStartE2EDuration="11.29293974s" podCreationTimestamp="2025-09-30 06:36:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:24.281234244 +0000 UTC m=+1027.756255294" watchObservedRunningTime="2025-09-30 06:36:24.29293974 +0000 UTC m=+1027.767960780" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.309001 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4a08181f-97e1-4058-b391-f380edf04dc4" (UID: "4a08181f-97e1-4058-b391-f380edf04dc4"). 
InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.309727 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.313831 4691 scope.go:117] "RemoveContainer" containerID="fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.323538 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-scripts\") pod \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.323580 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-combined-ca-bundle\") pod \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.323634 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.323656 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49bf9\" (UniqueName: \"kubernetes.io/projected/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-kube-api-access-49bf9\") pod \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.323680 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-public-tls-certs\") pod \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.323707 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-config-data\") pod \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.323728 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-httpd-run\") pod \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.323754 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-logs\") pod \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\" (UID: \"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4\") " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.324382 4691 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 
06:36:24.324399 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.324408 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a08181f-97e1-4058-b391-f380edf04dc4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.325248 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-logs" (OuterVolumeSpecName: "logs") pod "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" (UID: "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.327511 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-kube-api-access-49bf9" (OuterVolumeSpecName: "kube-api-access-49bf9") pod "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" (UID: "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4"). InnerVolumeSpecName "kube-api-access-49bf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.329505 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" (UID: "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.331969 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-f69875457-kvdnf" podStartSLOduration=5.855875192 podStartE2EDuration="19.331954632s" podCreationTimestamp="2025-09-30 06:36:05 +0000 UTC" firstStartedPulling="2025-09-30 06:36:07.428498833 +0000 UTC m=+1010.903519873" lastFinishedPulling="2025-09-30 06:36:20.904578273 +0000 UTC m=+1024.379599313" observedRunningTime="2025-09-30 06:36:24.316455385 +0000 UTC m=+1027.791476445" watchObservedRunningTime="2025-09-30 06:36:24.331954632 +0000 UTC m=+1027.806975672" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.340227 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-scripts" (OuterVolumeSpecName: "scripts") pod "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" (UID: "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.359049 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" (UID: "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.382978 4691 scope.go:117] "RemoveContainer" containerID="1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf" Sep 30 06:36:24 crc kubenswrapper[4691]: E0930 06:36:24.383580 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf\": container with ID starting with 1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf not found: ID does not exist" containerID="1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.383607 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf"} err="failed to get container status \"1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf\": rpc error: code = NotFound desc = could not find container \"1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf\": container with ID starting with 1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf not found: ID does not exist" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.383624 4691 scope.go:117] "RemoveContainer" containerID="fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.383721 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" (UID: "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: E0930 06:36:24.384008 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709\": container with ID starting with fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709 not found: ID does not exist" containerID="fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.384027 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709"} err="failed to get container status \"fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709\": rpc error: code = NotFound desc = could not find container \"fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709\": container with ID starting with fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709 not found: ID does not exist" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.384040 4691 scope.go:117] "RemoveContainer" containerID="1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.385667 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf"} err="failed to get container status \"1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf\": rpc error: code = NotFound desc = could not find container \"1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf\": container with ID starting with 1fd770ef1f9e0798800dc17f91cc04eeba9741e3bc31160f379a8023d84ac7cf not found: ID does not exist" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.385686 4691 scope.go:117] "RemoveContainer" containerID="fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.386824 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709"} err="failed to get container status \"fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709\": rpc error: code = NotFound desc = could not find container \"fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709\": container with ID starting with fc1ff28b65d91551a626aebd026f44be68a915fe6412e4c733806a8d41529709 not found: ID does not exist" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.415056 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" (UID: "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.427512 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.427709 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.427785 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.427841 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-49bf9\" (UniqueName: \"kubernetes.io/projected/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-kube-api-access-49bf9\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.427905 4691 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.427965 4691 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.428022 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.449828 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.450030 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-config-data" (OuterVolumeSpecName: "config-data") pod "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" (UID: "d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.513659 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.522609 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.529415 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.529447 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.538036 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:36:24 crc kubenswrapper[4691]: E0930 06:36:24.538391 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a08181f-97e1-4058-b391-f380edf04dc4" containerName="glance-httpd" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.538403 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a08181f-97e1-4058-b391-f380edf04dc4" containerName="glance-httpd" Sep 30 06:36:24 crc kubenswrapper[4691]: E0930 06:36:24.538410 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" containerName="glance-log" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.538416 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" containerName="glance-log" Sep 30 06:36:24 crc kubenswrapper[4691]: E0930 06:36:24.538430 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a08181f-97e1-4058-b391-f380edf04dc4" containerName="glance-log" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.538437 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a08181f-97e1-4058-b391-f380edf04dc4" containerName="glance-log" Sep 30 06:36:24 crc kubenswrapper[4691]: E0930 06:36:24.538453 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" containerName="glance-httpd" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.538460 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" containerName="glance-httpd" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.538651 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a08181f-97e1-4058-b391-f380edf04dc4" containerName="glance-log" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.538668 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" containerName="glance-httpd" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.538681 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" containerName="glance-log" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.538696 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a08181f-97e1-4058-b391-f380edf04dc4" containerName="glance-httpd" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.539651 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.553537 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.553719 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.561988 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.732238 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9h4x\" (UniqueName: \"kubernetes.io/projected/df950ad9-45e7-4c79-ba30-ef4b423809b0-kube-api-access-g9h4x\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.732289 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.732353 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.732397 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.732446 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.732502 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.732534 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.732575 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.834481 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9h4x\" (UniqueName: \"kubernetes.io/projected/df950ad9-45e7-4c79-ba30-ef4b423809b0-kube-api-access-g9h4x\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.834526 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.834568 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.834592 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.834640 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.834671 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.834695 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.834713 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.836252 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-logs\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.836289 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.836320 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.851516 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-scripts\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.858193 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.858548 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.860717 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9h4x\" (UniqueName: \"kubernetes.io/projected/df950ad9-45e7-4c79-ba30-ef4b423809b0-kube-api-access-g9h4x\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.860862 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-config-data\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:24 crc kubenswrapper[4691]: I0930 06:36:24.892874 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.010094 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.059122 4691 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.183242 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.298580 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a08181f-97e1-4058-b391-f380edf04dc4" path="/var/lib/kubelet/pods/4a08181f-97e1-4058-b391-f380edf04dc4/volumes" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.352719 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4","Type":"ContainerDied","Data":"d7d5dbafc155878b8eed18e2b35f3613b5d776cb483ae10923a6957d8f04e189"} Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.352782 4691 scope.go:117] "RemoveContainer" containerID="9ef33d0d68543fde3c6f90b5ade44a9c735c42e3b762cfebb590e6e556c7a956" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.353010 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.354355 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.355609 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.356366 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.375368 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.412222 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.422461 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.475817 4691 scope.go:117] "RemoveContainer" containerID="804d1b04157cf67e2e31ef98ae0b953c9c0f08bdd533624d4fec354dc9620e6a" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.476550 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.482762 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.487038 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.487519 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.492477 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.511660 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.550858 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.550929 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.550955 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.551004 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.551041 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-logs\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.551081 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.551113 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjdqv\" (UniqueName: \"kubernetes.io/projected/bd274ff4-663a-4621-af28-d5fca3e5b139-kube-api-access-bjdqv\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " 
pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.551149 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.572253 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.622393 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.625155 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.653555 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.653615 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.653642 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.653699 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.653737 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-logs\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.653786 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.653814 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjdqv\" (UniqueName: \"kubernetes.io/projected/bd274ff4-663a-4621-af28-d5fca3e5b139-kube-api-access-bjdqv\") pod \"glance-default-external-api-0\" (UID: 
\"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.653852 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.654520 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.662029 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.666409 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-logs\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.691135 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-scripts\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.691262 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.694221 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.699655 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjdqv\" (UniqueName: \"kubernetes.io/projected/bd274ff4-663a-4621-af28-d5fca3e5b139-kube-api-access-bjdqv\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.704175 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-config-data\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc 
kubenswrapper[4691]: I0930 06:36:25.751105 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " pod="openstack/glance-default-external-api-0" Sep 30 06:36:25 crc kubenswrapper[4691]: I0930 06:36:25.833451 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 06:36:26 crc kubenswrapper[4691]: I0930 06:36:26.085039 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:36:26 crc kubenswrapper[4691]: W0930 06:36:26.168775 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddf950ad9_45e7_4c79_ba30_ef4b423809b0.slice/crio-f2e64b185b731a5cfda931e79a68c8c1f00dea6e941defca043a69f79c3b9b71 WatchSource:0}: Error finding container f2e64b185b731a5cfda931e79a68c8c1f00dea6e941defca043a69f79c3b9b71: Status 404 returned error can't find the container with id f2e64b185b731a5cfda931e79a68c8c1f00dea6e941defca043a69f79c3b9b71 Sep 30 06:36:26 crc kubenswrapper[4691]: I0930 06:36:26.368218 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df950ad9-45e7-4c79-ba30-ef4b423809b0","Type":"ContainerStarted","Data":"f2e64b185b731a5cfda931e79a68c8c1f00dea6e941defca043a69f79c3b9b71"} Sep 30 06:36:26 crc kubenswrapper[4691]: I0930 06:36:26.421103 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Sep 30 06:36:26 crc kubenswrapper[4691]: I0930 06:36:26.443258 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:36:26 crc kubenswrapper[4691]: I0930 06:36:26.472090 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"] Sep 30 06:36:26 crc kubenswrapper[4691]: W0930 06:36:26.489612 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd274ff4_663a_4621_af28_d5fca3e5b139.slice/crio-5a164581da1961c3054d1499980b075e6558bafcef62c4a0f3321527ce14d0a6 WatchSource:0}: Error finding container 5a164581da1961c3054d1499980b075e6558bafcef62c4a0f3321527ce14d0a6: Status 404 returned error can't find the container with id 5a164581da1961c3054d1499980b075e6558bafcef62c4a0f3321527ce14d0a6 Sep 30 06:36:27 crc kubenswrapper[4691]: I0930 06:36:27.238852 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4" path="/var/lib/kubelet/pods/d9d464ee-9ac9-4aed-ba1c-5c8fbc0d9fa4/volumes" Sep 30 06:36:27 crc kubenswrapper[4691]: I0930 06:36:27.430335 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd274ff4-663a-4621-af28-d5fca3e5b139","Type":"ContainerStarted","Data":"5a2b6edbc643a5929286b269316a17d2033f2b4ddab512e09a4bf6912eb16c15"} Sep 30 06:36:27 crc kubenswrapper[4691]: I0930 06:36:27.430607 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd274ff4-663a-4621-af28-d5fca3e5b139","Type":"ContainerStarted","Data":"5a164581da1961c3054d1499980b075e6558bafcef62c4a0f3321527ce14d0a6"} Sep 30 06:36:27 crc kubenswrapper[4691]: I0930 06:36:27.433123 4691 generic.go:334] "Generic 
(PLEG): container finished" podID="6315532b-2604-4b19-8b3f-cb4bb9ff83f6" containerID="10cd68f7901e1f9bda7ddcbd82ff985ba27907d359a88feddc49294014fb128d" exitCode=0 Sep 30 06:36:27 crc kubenswrapper[4691]: I0930 06:36:27.433172 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jfg2g" event={"ID":"6315532b-2604-4b19-8b3f-cb4bb9ff83f6","Type":"ContainerDied","Data":"10cd68f7901e1f9bda7ddcbd82ff985ba27907d359a88feddc49294014fb128d"} Sep 30 06:36:27 crc kubenswrapper[4691]: I0930 06:36:27.461097 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df950ad9-45e7-4c79-ba30-ef4b423809b0","Type":"ContainerStarted","Data":"6e10c72881f1db003dcd37cc7e2bc8d9653b2faad08dd92dbbc418620d9785a7"} Sep 30 06:36:27 crc kubenswrapper[4691]: I0930 06:36:27.461331 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-decision-engine-0" podUID="b9e263ab-2384-42b0-8c7b-a787bcf361a9" containerName="watcher-decision-engine" containerID="cri-o://3d912d20c8216080d5d5d0f48d6809ba39b87bf21d5a0613b94bdc715405462c" gracePeriod=30 Sep 30 06:36:28 crc kubenswrapper[4691]: I0930 06:36:28.481979 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd274ff4-663a-4621-af28-d5fca3e5b139","Type":"ContainerStarted","Data":"b84ccdedf764479d11a712b4b4ddd548c5760af2524c9c926d968905051ec7a8"} Sep 30 06:36:28 crc kubenswrapper[4691]: I0930 06:36:28.484614 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b9e263ab-2384-42b0-8c7b-a787bcf361a9","Type":"ContainerDied","Data":"3d912d20c8216080d5d5d0f48d6809ba39b87bf21d5a0613b94bdc715405462c"} Sep 30 06:36:28 crc kubenswrapper[4691]: I0930 06:36:28.484758 4691 generic.go:334] "Generic (PLEG): container finished" podID="b9e263ab-2384-42b0-8c7b-a787bcf361a9" containerID="3d912d20c8216080d5d5d0f48d6809ba39b87bf21d5a0613b94bdc715405462c" exitCode=1 Sep 30 06:36:28 crc kubenswrapper[4691]: I0930 06:36:28.485067 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-applier-0" podUID="a1cbb8f9-118e-48b5-ae92-067ece5295a2" containerName="watcher-applier" containerID="cri-o://0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b" gracePeriod=30 Sep 30 06:36:28 crc kubenswrapper[4691]: I0930 06:36:28.515324 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.515307397 podStartE2EDuration="3.515307397s" podCreationTimestamp="2025-09-30 06:36:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:28.510328678 +0000 UTC m=+1031.985349718" watchObservedRunningTime="2025-09-30 06:36:28.515307397 +0000 UTC m=+1031.990328437" Sep 30 06:36:28 crc kubenswrapper[4691]: I0930 06:36:28.613607 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:29 crc kubenswrapper[4691]: I0930 06:36:29.497555 4691 generic.go:334] "Generic (PLEG): container finished" podID="a1cbb8f9-118e-48b5-ae92-067ece5295a2" containerID="0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b" exitCode=0 Sep 30 06:36:29 crc kubenswrapper[4691]: I0930 06:36:29.497587 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" 
event={"ID":"a1cbb8f9-118e-48b5-ae92-067ece5295a2","Type":"ContainerDied","Data":"0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b"} Sep 30 06:36:29 crc kubenswrapper[4691]: I0930 06:36:29.499723 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca252689-a07e-4e84-a79f-7884687c6db3" containerID="102e58ebdd5146960fce63254dfc2ca4b3e0d16f64058217cd413713ce9faf5e" exitCode=0 Sep 30 06:36:29 crc kubenswrapper[4691]: I0930 06:36:29.499795 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j7phx" event={"ID":"ca252689-a07e-4e84-a79f-7884687c6db3","Type":"ContainerDied","Data":"102e58ebdd5146960fce63254dfc2ca4b3e0d16f64058217cd413713ce9faf5e"} Sep 30 06:36:30 crc kubenswrapper[4691]: E0930 06:36:30.357190 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b is running failed: container process not found" containerID="0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 06:36:30 crc kubenswrapper[4691]: E0930 06:36:30.357547 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b is running failed: container process not found" containerID="0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 06:36:30 crc kubenswrapper[4691]: E0930 06:36:30.357916 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b is running failed: container process not found" containerID="0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 06:36:30 crc kubenswrapper[4691]: E0930 06:36:30.357953 4691 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="a1cbb8f9-118e-48b5-ae92-067ece5295a2" containerName="watcher-applier" Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.671068 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.818728 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.820685 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75fd5776c-42zjc"] Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.821004 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" podUID="5385150d-abdd-4b17-bbf4-fee7d4b5946e" containerName="dnsmasq-dns" containerID="cri-o://6a0753e503f95d846c3c259455c74b6e174a2c797cab68037838cd926dabbc48" gracePeriod=10 Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.937693 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-config-data\") pod \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.937813 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-combined-ca-bundle\") pod \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.937836 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pp995\" (UniqueName: \"kubernetes.io/projected/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-kube-api-access-pp995\") pod \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.937963 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-logs\") pod \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.938023 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-scripts\") pod \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\" (UID: \"6315532b-2604-4b19-8b3f-cb4bb9ff83f6\") " Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.940426 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-logs" (OuterVolumeSpecName: "logs") pod "6315532b-2604-4b19-8b3f-cb4bb9ff83f6" (UID: "6315532b-2604-4b19-8b3f-cb4bb9ff83f6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.947375 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-scripts" (OuterVolumeSpecName: "scripts") pod "6315532b-2604-4b19-8b3f-cb4bb9ff83f6" (UID: "6315532b-2604-4b19-8b3f-cb4bb9ff83f6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.948078 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-kube-api-access-pp995" (OuterVolumeSpecName: "kube-api-access-pp995") pod "6315532b-2604-4b19-8b3f-cb4bb9ff83f6" (UID: "6315532b-2604-4b19-8b3f-cb4bb9ff83f6"). 
InnerVolumeSpecName "kube-api-access-pp995". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:30 crc kubenswrapper[4691]: I0930 06:36:30.995013 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-config-data" (OuterVolumeSpecName: "config-data") pod "6315532b-2604-4b19-8b3f-cb4bb9ff83f6" (UID: "6315532b-2604-4b19-8b3f-cb4bb9ff83f6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.013051 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6315532b-2604-4b19-8b3f-cb4bb9ff83f6" (UID: "6315532b-2604-4b19-8b3f-cb4bb9ff83f6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.040386 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.040492 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.040558 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.040611 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.040664 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pp995\" (UniqueName: \"kubernetes.io/projected/6315532b-2604-4b19-8b3f-cb4bb9ff83f6-kube-api-access-pp995\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.525551 4691 generic.go:334] "Generic (PLEG): container finished" podID="5385150d-abdd-4b17-bbf4-fee7d4b5946e" containerID="6a0753e503f95d846c3c259455c74b6e174a2c797cab68037838cd926dabbc48" exitCode=0 Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.525607 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" event={"ID":"5385150d-abdd-4b17-bbf4-fee7d4b5946e","Type":"ContainerDied","Data":"6a0753e503f95d846c3c259455c74b6e174a2c797cab68037838cd926dabbc48"} Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.526678 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-jfg2g" event={"ID":"6315532b-2604-4b19-8b3f-cb4bb9ff83f6","Type":"ContainerDied","Data":"a8f545033fc6e1ac50240d5db06872701110df32f61d65121614a167ed2acc9f"} Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.526699 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a8f545033fc6e1ac50240d5db06872701110df32f61d65121614a167ed2acc9f" Sep 30 06:36:31 crc kubenswrapper[4691]: I0930 06:36:31.526757 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-jfg2g" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.013739 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-56d945494d-7svb6"] Sep 30 06:36:32 crc kubenswrapper[4691]: E0930 06:36:32.015129 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6315532b-2604-4b19-8b3f-cb4bb9ff83f6" containerName="placement-db-sync" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.015141 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6315532b-2604-4b19-8b3f-cb4bb9ff83f6" containerName="placement-db-sync" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.015310 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6315532b-2604-4b19-8b3f-cb4bb9ff83f6" containerName="placement-db-sync" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.017270 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.021186 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-fc6j9" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.025327 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.025368 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.025439 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.025496 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.038969 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56d945494d-7svb6"] Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.161795 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-internal-tls-certs\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.161875 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-public-tls-certs\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.161924 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpcpr\" (UniqueName: \"kubernetes.io/projected/d35f539b-5139-4155-8f51-a1e425e19925-kube-api-access-rpcpr\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.161941 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-combined-ca-bundle\") pod 
\"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.161975 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d35f539b-5139-4155-8f51-a1e425e19925-logs\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.162007 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-config-data\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.162029 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-scripts\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.264258 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-scripts\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.264366 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-internal-tls-certs\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.264414 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-public-tls-certs\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.264445 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpcpr\" (UniqueName: \"kubernetes.io/projected/d35f539b-5139-4155-8f51-a1e425e19925-kube-api-access-rpcpr\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.264466 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-combined-ca-bundle\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.264502 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d35f539b-5139-4155-8f51-a1e425e19925-logs\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " 
pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.264533 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-config-data\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.265100 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d35f539b-5139-4155-8f51-a1e425e19925-logs\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.271387 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-scripts\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.271623 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-combined-ca-bundle\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.272034 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-public-tls-certs\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.273428 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-internal-tls-certs\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.284372 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpcpr\" (UniqueName: \"kubernetes.io/projected/d35f539b-5139-4155-8f51-a1e425e19925-kube-api-access-rpcpr\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.285557 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d35f539b-5139-4155-8f51-a1e425e19925-config-data\") pod \"placement-56d945494d-7svb6\" (UID: \"d35f539b-5139-4155-8f51-a1e425e19925\") " pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:32 crc kubenswrapper[4691]: I0930 06:36:32.335132 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-56d945494d-7svb6" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.041068 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.041111 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.042995 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5755bf4df8-zx9td" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.113753 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.113786 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.115267 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-56948c48fd-czzmm" podUID="fec95e1e-14f4-4093-b1d4-402c29686348" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.160:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.160:8443: connect: connection refused" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.365128 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.378268 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.403028 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509456 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-scripts\") pod \"ca252689-a07e-4e84-a79f-7884687c6db3\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509537 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-combined-ca-bundle\") pod \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509571 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-custom-prometheus-ca\") pod \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509595 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-config-data\") pod \"ca252689-a07e-4e84-a79f-7884687c6db3\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509625 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-credential-keys\") pod \"ca252689-a07e-4e84-a79f-7884687c6db3\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509658 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-nb\") pod \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509703 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-config-data\") pod \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509733 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e263ab-2384-42b0-8c7b-a787bcf361a9-logs\") pod \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509770 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-svc\") pod \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509822 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-config\") pod \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " Sep 30 06:36:34 crc 
kubenswrapper[4691]: I0930 06:36:34.509844 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-combined-ca-bundle\") pod \"ca252689-a07e-4e84-a79f-7884687c6db3\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509929 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4m8m\" (UniqueName: \"kubernetes.io/projected/5385150d-abdd-4b17-bbf4-fee7d4b5946e-kube-api-access-v4m8m\") pod \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.509951 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-swift-storage-0\") pod \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.510006 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6zpp\" (UniqueName: \"kubernetes.io/projected/b9e263ab-2384-42b0-8c7b-a787bcf361a9-kube-api-access-k6zpp\") pod \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\" (UID: \"b9e263ab-2384-42b0-8c7b-a787bcf361a9\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.510041 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-sb\") pod \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\" (UID: \"5385150d-abdd-4b17-bbf4-fee7d4b5946e\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.510072 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-fernet-keys\") pod \"ca252689-a07e-4e84-a79f-7884687c6db3\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.510088 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6ww5\" (UniqueName: \"kubernetes.io/projected/ca252689-a07e-4e84-a79f-7884687c6db3-kube-api-access-b6ww5\") pod \"ca252689-a07e-4e84-a79f-7884687c6db3\" (UID: \"ca252689-a07e-4e84-a79f-7884687c6db3\") " Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.511397 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9e263ab-2384-42b0-8c7b-a787bcf361a9-logs" (OuterVolumeSpecName: "logs") pod "b9e263ab-2384-42b0-8c7b-a787bcf361a9" (UID: "b9e263ab-2384-42b0-8c7b-a787bcf361a9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.532785 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5385150d-abdd-4b17-bbf4-fee7d4b5946e-kube-api-access-v4m8m" (OuterVolumeSpecName: "kube-api-access-v4m8m") pod "5385150d-abdd-4b17-bbf4-fee7d4b5946e" (UID: "5385150d-abdd-4b17-bbf4-fee7d4b5946e"). InnerVolumeSpecName "kube-api-access-v4m8m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.533132 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-scripts" (OuterVolumeSpecName: "scripts") pod "ca252689-a07e-4e84-a79f-7884687c6db3" (UID: "ca252689-a07e-4e84-a79f-7884687c6db3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.533459 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "ca252689-a07e-4e84-a79f-7884687c6db3" (UID: "ca252689-a07e-4e84-a79f-7884687c6db3"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.533660 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca252689-a07e-4e84-a79f-7884687c6db3-kube-api-access-b6ww5" (OuterVolumeSpecName: "kube-api-access-b6ww5") pod "ca252689-a07e-4e84-a79f-7884687c6db3" (UID: "ca252689-a07e-4e84-a79f-7884687c6db3"). InnerVolumeSpecName "kube-api-access-b6ww5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.537997 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9e263ab-2384-42b0-8c7b-a787bcf361a9-kube-api-access-k6zpp" (OuterVolumeSpecName: "kube-api-access-k6zpp") pod "b9e263ab-2384-42b0-8c7b-a787bcf361a9" (UID: "b9e263ab-2384-42b0-8c7b-a787bcf361a9"). InnerVolumeSpecName "kube-api-access-k6zpp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.544065 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "ca252689-a07e-4e84-a79f-7884687c6db3" (UID: "ca252689-a07e-4e84-a79f-7884687c6db3"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.610180 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.610301 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"b9e263ab-2384-42b0-8c7b-a787bcf361a9","Type":"ContainerDied","Data":"7b769bc63e2eed1b3da0b54fd040ddc4dc6bd4ba6a38e7c98cbf24afbaa35567"} Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.610414 4691 scope.go:117] "RemoveContainer" containerID="3d912d20c8216080d5d5d0f48d6809ba39b87bf21d5a0613b94bdc715405462c" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.611592 4691 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.611607 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6ww5\" (UniqueName: \"kubernetes.io/projected/ca252689-a07e-4e84-a79f-7884687c6db3-kube-api-access-b6ww5\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.611616 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.611625 4691 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-credential-keys\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.611634 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9e263ab-2384-42b0-8c7b-a787bcf361a9-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.611643 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4m8m\" (UniqueName: \"kubernetes.io/projected/5385150d-abdd-4b17-bbf4-fee7d4b5946e-kube-api-access-v4m8m\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.611651 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6zpp\" (UniqueName: \"kubernetes.io/projected/b9e263ab-2384-42b0-8c7b-a787bcf361a9-kube-api-access-k6zpp\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.614158 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca252689-a07e-4e84-a79f-7884687c6db3" (UID: "ca252689-a07e-4e84-a79f-7884687c6db3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.623204 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b9e263ab-2384-42b0-8c7b-a787bcf361a9" (UID: "b9e263ab-2384-42b0-8c7b-a787bcf361a9"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.625161 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-config-data" (OuterVolumeSpecName: "config-data") pod "ca252689-a07e-4e84-a79f-7884687c6db3" (UID: "ca252689-a07e-4e84-a79f-7884687c6db3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.627954 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "5385150d-abdd-4b17-bbf4-fee7d4b5946e" (UID: "5385150d-abdd-4b17-bbf4-fee7d4b5946e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.629248 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j7phx" event={"ID":"ca252689-a07e-4e84-a79f-7884687c6db3","Type":"ContainerDied","Data":"f0efd8fd7ae1b1bccd870316d8314f6abc06f1014bbd00c71ad10cdfd06fa70c"} Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.629283 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0efd8fd7ae1b1bccd870316d8314f6abc06f1014bbd00c71ad10cdfd06fa70c" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.629337 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j7phx" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.647396 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "5385150d-abdd-4b17-bbf4-fee7d4b5946e" (UID: "5385150d-abdd-4b17-bbf4-fee7d4b5946e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.657031 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-config" (OuterVolumeSpecName: "config") pod "5385150d-abdd-4b17-bbf4-fee7d4b5946e" (UID: "5385150d-abdd-4b17-bbf4-fee7d4b5946e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.657658 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" event={"ID":"5385150d-abdd-4b17-bbf4-fee7d4b5946e","Type":"ContainerDied","Data":"a7a31101f62228e8fcb6f246abc71893edf4eb180fd0485bf9ed33a3cd12716d"} Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.657741 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.682179 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "5385150d-abdd-4b17-bbf4-fee7d4b5946e" (UID: "5385150d-abdd-4b17-bbf4-fee7d4b5946e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.682406 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "5385150d-abdd-4b17-bbf4-fee7d4b5946e" (UID: "5385150d-abdd-4b17-bbf4-fee7d4b5946e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.686145 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9e263ab-2384-42b0-8c7b-a787bcf361a9" (UID: "b9e263ab-2384-42b0-8c7b-a787bcf361a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.713794 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.713928 4691 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.714730 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.714801 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.714870 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.714943 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.714995 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca252689-a07e-4e84-a79f-7884687c6db3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.715072 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.715154 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/5385150d-abdd-4b17-bbf4-fee7d4b5946e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.733995 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-config-data" (OuterVolumeSpecName: "config-data") pod "b9e263ab-2384-42b0-8c7b-a787bcf361a9" (UID: "b9e263ab-2384-42b0-8c7b-a787bcf361a9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.817012 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9e263ab-2384-42b0-8c7b-a787bcf361a9-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.952061 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.961431 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.975269 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 06:36:34 crc kubenswrapper[4691]: E0930 06:36:34.975604 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5385150d-abdd-4b17-bbf4-fee7d4b5946e" containerName="dnsmasq-dns" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.975622 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5385150d-abdd-4b17-bbf4-fee7d4b5946e" containerName="dnsmasq-dns" Sep 30 06:36:34 crc kubenswrapper[4691]: E0930 06:36:34.975644 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9e263ab-2384-42b0-8c7b-a787bcf361a9" containerName="watcher-decision-engine" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.975651 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9e263ab-2384-42b0-8c7b-a787bcf361a9" containerName="watcher-decision-engine" Sep 30 06:36:34 crc kubenswrapper[4691]: E0930 06:36:34.975666 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca252689-a07e-4e84-a79f-7884687c6db3" containerName="keystone-bootstrap" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.975672 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca252689-a07e-4e84-a79f-7884687c6db3" containerName="keystone-bootstrap" Sep 30 06:36:34 crc kubenswrapper[4691]: E0930 06:36:34.975690 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5385150d-abdd-4b17-bbf4-fee7d4b5946e" containerName="init" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.975695 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5385150d-abdd-4b17-bbf4-fee7d4b5946e" containerName="init" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.975852 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9e263ab-2384-42b0-8c7b-a787bcf361a9" containerName="watcher-decision-engine" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.975915 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5385150d-abdd-4b17-bbf4-fee7d4b5946e" containerName="dnsmasq-dns" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.975927 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca252689-a07e-4e84-a79f-7884687c6db3" containerName="keystone-bootstrap" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.976485 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 06:36:34 crc kubenswrapper[4691]: I0930 06:36:34.978438 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-decision-engine-config-data" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:34.989152 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"] Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.015338 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-75fd5776c-42zjc"] Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.023760 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.023935 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7tjl\" (UniqueName: \"kubernetes.io/projected/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-kube-api-access-b7tjl\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.024061 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-logs\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.024136 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.024205 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-config-data\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.030298 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-75fd5776c-42zjc"] Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.126033 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7tjl\" (UniqueName: \"kubernetes.io/projected/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-kube-api-access-b7tjl\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.127011 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-logs\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 
06:36:35.127126 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.127219 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-config-data\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.127371 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.131312 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-logs\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.135457 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-custom-prometheus-ca\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.139419 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-config-data\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.144471 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-combined-ca-bundle\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.161371 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7tjl\" (UniqueName: \"kubernetes.io/projected/af3f1644-3ab8-4a6a-9f80-f8ea42297e98-kube-api-access-b7tjl\") pod \"watcher-decision-engine-0\" (UID: \"af3f1644-3ab8-4a6a-9f80-f8ea42297e98\") " pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.255852 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5385150d-abdd-4b17-bbf4-fee7d4b5946e" path="/var/lib/kubelet/pods/5385150d-abdd-4b17-bbf4-fee7d4b5946e/volumes" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.256461 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9e263ab-2384-42b0-8c7b-a787bcf361a9" path="/var/lib/kubelet/pods/b9e263ab-2384-42b0-8c7b-a787bcf361a9/volumes" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.334689 4691 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-decision-engine-0" Sep 30 06:36:35 crc kubenswrapper[4691]: E0930 06:36:35.356970 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b is running failed: container process not found" containerID="0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 06:36:35 crc kubenswrapper[4691]: E0930 06:36:35.357383 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b is running failed: container process not found" containerID="0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 06:36:35 crc kubenswrapper[4691]: E0930 06:36:35.357691 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b is running failed: container process not found" containerID="0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b" cmd=["/usr/bin/pgrep","-r","DRST","watcher-applier"] Sep 30 06:36:35 crc kubenswrapper[4691]: E0930 06:36:35.357734 4691 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b is running failed: container process not found" probeType="Readiness" pod="openstack/watcher-applier-0" podUID="a1cbb8f9-118e-48b5-ae92-067ece5295a2" containerName="watcher-applier" Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.529980 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bf6754cd6-fsq4c"] Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.531222 4691 util.go:30] "No sandbox for pod can be found. 
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.534764 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.534820 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-fxfcb"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.534983 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.535138 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.535189 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.535253 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.562936 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bf6754cd6-fsq4c"]
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.665115 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-config-data\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.665384 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-scripts\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.665424 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-public-tls-certs\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.665467 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-credential-keys\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.665508 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8jvg\" (UniqueName: \"kubernetes.io/projected/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-kube-api-access-g8jvg\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.665545 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-fernet-keys\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.665577 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-combined-ca-bundle\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.665600 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-internal-tls-certs\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.667416 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df950ad9-45e7-4c79-ba30-ef4b423809b0","Type":"ContainerStarted","Data":"80b311ef92245ba29ced3c5c44eb4f6e0a4d8ae29cd79b446f38c4dda55ef30f"}
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.690877 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=11.690855674 podStartE2EDuration="11.690855674s" podCreationTimestamp="2025-09-30 06:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:35.688309862 +0000 UTC m=+1039.163330912" watchObservedRunningTime="2025-09-30 06:36:35.690855674 +0000 UTC m=+1039.165876714"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.767108 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-combined-ca-bundle\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.767167 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-internal-tls-certs\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.767217 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-config-data\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.767243 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-scripts\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.767289 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-public-tls-certs\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.767345 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-credential-keys\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.767397 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8jvg\" (UniqueName: \"kubernetes.io/projected/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-kube-api-access-g8jvg\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.767471 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-fernet-keys\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.783649 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-config-data\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.784013 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-combined-ca-bundle\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.784130 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-internal-tls-certs\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.784343 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8jvg\" (UniqueName: \"kubernetes.io/projected/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-kube-api-access-g8jvg\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.787320 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-fernet-keys\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.791772 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-scripts\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.808426 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-credential-keys\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.815112 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fef5a53-6fb3-4c3b-8929-e9e49f85f050-public-tls-certs\") pod \"keystone-bf6754cd6-fsq4c\" (UID: \"5fef5a53-6fb3-4c3b-8929-e9e49f85f050\") " pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.834097 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.834483 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.847357 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.880752 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 30 06:36:35 crc kubenswrapper[4691]: I0930 06:36:35.883339 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Sep 30 06:36:36 crc kubenswrapper[4691]: I0930 06:36:36.678018 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Sep 30 06:36:36 crc kubenswrapper[4691]: I0930 06:36:36.678280 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Sep 30 06:36:37 crc kubenswrapper[4691]: I0930 06:36:37.595528 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-75fd5776c-42zjc" podUID="5385150d-abdd-4b17-bbf4-fee7d4b5946e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.135:5353: i/o timeout"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.287489 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
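
The dnsmasq readiness failure above reports "dial tcp 10.217.0.135:5353: i/o timeout", which is the error text of a failed TCP dial; this is what a TCP-socket readiness probe reduces to. A minimal stdlib sketch of the equivalent check; the address comes from the log, the 1-second timeout is an assumed probe timeout, and whether the actual probe is a TCPSocket action is an inference from the output format, not confirmed by the log.

package main

import (
	"fmt"
	"net"
	"time"
)

// Sketch: the dial a TCP readiness probe performs. A timeout produces
// exactly the "dial tcp <addr>: i/o timeout" string seen in the log.
func main() {
	conn, err := net.DialTimeout("tcp", "10.217.0.135:5353", 1*time.Second)
	if err != nil {
		fmt.Println("probe failure:", err)
		return
	}
	conn.Close()
	fmt.Println("probe success")
}
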
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.336959 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9dpg\" (UniqueName: \"kubernetes.io/projected/a1cbb8f9-118e-48b5-ae92-067ece5295a2-kube-api-access-s9dpg\") pod \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") "
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.337051 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-combined-ca-bundle\") pod \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") "
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.337139 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cbb8f9-118e-48b5-ae92-067ece5295a2-logs\") pod \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") "
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.337281 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-config-data\") pod \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\" (UID: \"a1cbb8f9-118e-48b5-ae92-067ece5295a2\") "
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.338467 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1cbb8f9-118e-48b5-ae92-067ece5295a2-logs" (OuterVolumeSpecName: "logs") pod "a1cbb8f9-118e-48b5-ae92-067ece5295a2" (UID: "a1cbb8f9-118e-48b5-ae92-067ece5295a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.344552 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1cbb8f9-118e-48b5-ae92-067ece5295a2-kube-api-access-s9dpg" (OuterVolumeSpecName: "kube-api-access-s9dpg") pod "a1cbb8f9-118e-48b5-ae92-067ece5295a2" (UID: "a1cbb8f9-118e-48b5-ae92-067ece5295a2"). InnerVolumeSpecName "kube-api-access-s9dpg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.398775 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-config-data" (OuterVolumeSpecName: "config-data") pod "a1cbb8f9-118e-48b5-ae92-067ece5295a2" (UID: "a1cbb8f9-118e-48b5-ae92-067ece5295a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.422362 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1cbb8f9-118e-48b5-ae92-067ece5295a2" (UID: "a1cbb8f9-118e-48b5-ae92-067ece5295a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.440979 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.441012 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a1cbb8f9-118e-48b5-ae92-067ece5295a2-logs\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.441027 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1cbb8f9-118e-48b5-ae92-067ece5295a2-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.441048 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9dpg\" (UniqueName: \"kubernetes.io/projected/a1cbb8f9-118e-48b5-ae92-067ece5295a2-kube-api-access-s9dpg\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.700992 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.701211 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.705189 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-56d945494d-7svb6"]
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.717988 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.775063 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"a1cbb8f9-118e-48b5-ae92-067ece5295a2","Type":"ContainerDied","Data":"d8b8310b1fbbb98511bfeaa0ddaf5c55824e3d49e5b5758e3664cdb2043c48ae"}
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.775157 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.820439 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-applier-0"]
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.833848 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-applier-0"]
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.848449 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-applier-0"]
Sep 30 06:36:39 crc kubenswrapper[4691]: E0930 06:36:39.848871 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1cbb8f9-118e-48b5-ae92-067ece5295a2" containerName="watcher-applier"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.859140 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1cbb8f9-118e-48b5-ae92-067ece5295a2" containerName="watcher-applier"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.859604 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1cbb8f9-118e-48b5-ae92-067ece5295a2" containerName="watcher-applier"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.860628 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.864388 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-applier-config-data"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.866830 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.949794 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz7mp\" (UniqueName: \"kubernetes.io/projected/901f3032-8727-419d-8de7-b00c08535ca1-kube-api-access-fz7mp\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.950161 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901f3032-8727-419d-8de7-b00c08535ca1-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.950191 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901f3032-8727-419d-8de7-b00c08535ca1-logs\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:39 crc kubenswrapper[4691]: I0930 06:36:39.950252 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901f3032-8727-419d-8de7-b00c08535ca1-config-data\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.051922 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901f3032-8727-419d-8de7-b00c08535ca1-config-data\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.051993 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz7mp\" (UniqueName: \"kubernetes.io/projected/901f3032-8727-419d-8de7-b00c08535ca1-kube-api-access-fz7mp\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.052071 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901f3032-8727-419d-8de7-b00c08535ca1-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.052098 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901f3032-8727-419d-8de7-b00c08535ca1-logs\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.052544 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/901f3032-8727-419d-8de7-b00c08535ca1-logs\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.055906 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/901f3032-8727-419d-8de7-b00c08535ca1-config-data\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.057026 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/901f3032-8727-419d-8de7-b00c08535ca1-combined-ca-bundle\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.069163 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz7mp\" (UniqueName: \"kubernetes.io/projected/901f3032-8727-419d-8de7-b00c08535ca1-kube-api-access-fz7mp\") pod \"watcher-applier-0\" (UID: \"901f3032-8727-419d-8de7-b00c08535ca1\") " pod="openstack/watcher-applier-0"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.225124 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-applier-0"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.328578 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: connect: connection refused"
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.797085 4691 generic.go:334] "Generic (PLEG): container finished" podID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerID="804a939313f46aebffa0047313d44f607be5216b0c042e5bcae445243847c38d" exitCode=137
Sep 30 06:36:40 crc kubenswrapper[4691]: I0930 06:36:40.797133 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d","Type":"ContainerDied","Data":"804a939313f46aebffa0047313d44f607be5216b0c042e5bcae445243847c38d"}
Sep 30 06:36:41 crc kubenswrapper[4691]: I0930 06:36:41.238509 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1cbb8f9-118e-48b5-ae92-067ece5295a2" path="/var/lib/kubelet/pods/a1cbb8f9-118e-48b5-ae92-067ece5295a2/volumes"
Sep 30 06:36:45 crc kubenswrapper[4691]: I0930 06:36:45.183981 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Sep 30 06:36:45 crc kubenswrapper[4691]: I0930 06:36:45.184772 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Sep 30 06:36:45 crc kubenswrapper[4691]: I0930 06:36:45.240108 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Sep 30 06:36:45 crc kubenswrapper[4691]: I0930 06:36:45.260040 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Sep 30 06:36:45 crc kubenswrapper[4691]: I0930 06:36:45.328405 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: connect: connection refused"
probeResult="failure" output="Get \"http://10.217.0.149:9322/\": dial tcp 10.217.0.149:9322: connect: connection refused" Sep 30 06:36:45 crc kubenswrapper[4691]: I0930 06:36:45.846697 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:45 crc kubenswrapper[4691]: I0930 06:36:45.846731 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:45 crc kubenswrapper[4691]: I0930 06:36:45.944516 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:45 crc kubenswrapper[4691]: I0930 06:36:45.953173 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:47 crc kubenswrapper[4691]: I0930 06:36:47.959970 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:47 crc kubenswrapper[4691]: I0930 06:36:47.961196 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Sep 30 06:36:47 crc kubenswrapper[4691]: I0930 06:36:47.975822 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 06:36:49 crc kubenswrapper[4691]: I0930 06:36:49.725654 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-56948c48fd-czzmm" Sep 30 06:36:49 crc kubenswrapper[4691]: I0930 06:36:49.727670 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:36:49 crc kubenswrapper[4691]: I0930 06:36:49.794392 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5755bf4df8-zx9td"] Sep 30 06:36:49 crc kubenswrapper[4691]: I0930 06:36:49.892342 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5755bf4df8-zx9td" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon-log" containerID="cri-o://b433c5a51bea7ed49e49b10d76eaae08ac7b93bc283b6448ac328bf46caa578c" gracePeriod=30 Sep 30 06:36:49 crc kubenswrapper[4691]: I0930 06:36:49.892819 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56d945494d-7svb6" event={"ID":"d35f539b-5139-4155-8f51-a1e425e19925","Type":"ContainerStarted","Data":"6d895667899160158deaab7af27ac99392566ebd4d3ea289ddde5d78313927df"} Sep 30 06:36:49 crc kubenswrapper[4691]: I0930 06:36:49.893088 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5755bf4df8-zx9td" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon" containerID="cri-o://6cf0134f04e4636a4eb6deba05544969528280e6e7e910d2525ad3eab4aef3df" gracePeriod=30 Sep 30 06:36:50 crc kubenswrapper[4691]: E0930 06:36:50.597978 4691 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Sep 30 06:36:50 crc kubenswrapper[4691]: E0930 06:36:50.598042 4691 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.30:5001/podified-master-centos10/openstack-cinder-api:watcher_latest" Sep 30 06:36:50 crc kubenswrapper[4691]: E0930 06:36:50.598190 4691 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:cinder-db-sync,Image:38.102.83.30:5001/podified-master-centos10/openstack-cinder-api:watcher_latest,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mjg5r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-fh78b_openstack(071e402d-9775-412e-ad8a-1643cd646d7c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Sep 30 06:36:50 crc kubenswrapper[4691]: E0930 06:36:50.599343 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-fh78b" podUID="071e402d-9775-412e-ad8a-1643cd646d7c" Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.622184 4691 scope.go:117] "RemoveContainer" containerID="6a0753e503f95d846c3c259455c74b6e174a2c797cab68037838cd926dabbc48" Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.738426 4691 scope.go:117] "RemoveContainer" containerID="40e732a792146e552b05fa44be23dda22f4f40a9eff19885c012d4acec28c16a" Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.786837 4691 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.812021 4691 scope.go:117] "RemoveContainer" containerID="0b11f18aefe7f7db7a62a58255ec5d31d85eaf58a99f7dfcbd8f5869b1dd973b"
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.855589 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-logs\") pod \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") "
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.855636 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-combined-ca-bundle\") pod \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") "
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.856329 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-config-data\") pod \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") "
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.856351 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-custom-prometheus-ca\") pod \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") "
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.856475 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spwrv\" (UniqueName: \"kubernetes.io/projected/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-kube-api-access-spwrv\") pod \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\" (UID: \"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d\") "
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.856129 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-logs" (OuterVolumeSpecName: "logs") pod "a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" (UID: "a0cca0db-c02c-407f-9ba0-9dbaabdfa22d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.870314 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-kube-api-access-spwrv" (OuterVolumeSpecName: "kube-api-access-spwrv") pod "a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" (UID: "a0cca0db-c02c-407f-9ba0-9dbaabdfa22d"). InnerVolumeSpecName "kube-api-access-spwrv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.902838 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"a0cca0db-c02c-407f-9ba0-9dbaabdfa22d","Type":"ContainerDied","Data":"1799b4e2e11916b8561607f8bd29fe120a6c0971cacdebfdd8856c7ae07d3ee5"}
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.903056 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.903109 4691 scope.go:117] "RemoveContainer" containerID="804a939313f46aebffa0047313d44f607be5216b0c042e5bcae445243847c38d"
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.910733 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56d945494d-7svb6" event={"ID":"d35f539b-5139-4155-8f51-a1e425e19925","Type":"ContainerStarted","Data":"44304f88bbedcc1b773c1ed64d8ad01c69cd5bbcc3e646d05a6ca9ee9fd6e9ad"}
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.914836 4691 generic.go:334] "Generic (PLEG): container finished" podID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerID="6cf0134f04e4636a4eb6deba05544969528280e6e7e910d2525ad3eab4aef3df" exitCode=0
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.915038 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5755bf4df8-zx9td" event={"ID":"cc29be35-3ceb-4a88-af6e-77e2d0cbab83","Type":"ContainerDied","Data":"6cf0134f04e4636a4eb6deba05544969528280e6e7e910d2525ad3eab4aef3df"}
Sep 30 06:36:50 crc kubenswrapper[4691]: E0930 06:36:50.917164 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.30:5001/podified-master-centos10/openstack-cinder-api:watcher_latest\\\"\"" pod="openstack/cinder-db-sync-fh78b" podUID="071e402d-9775-412e-ad8a-1643cd646d7c"
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.953615 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" (UID: "a0cca0db-c02c-407f-9ba0-9dbaabdfa22d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.958381 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" (UID: "a0cca0db-c02c-407f-9ba0-9dbaabdfa22d"). InnerVolumeSpecName "custom-prometheus-ca". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.958929 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spwrv\" (UniqueName: \"kubernetes.io/projected/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-kube-api-access-spwrv\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.958954 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-logs\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.958965 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.958973 4691 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-custom-prometheus-ca\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.981416 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-config-data" (OuterVolumeSpecName: "config-data") pod "a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" (UID: "a0cca0db-c02c-407f-9ba0-9dbaabdfa22d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:36:50 crc kubenswrapper[4691]: I0930 06:36:50.992645 4691 scope.go:117] "RemoveContainer" containerID="58ea47eba5766897abe87f1915f5f3bb77b89b35aba64c8932cb8aad7e531a91"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.048258 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-decision-engine-0"]
Sep 30 06:36:51 crc kubenswrapper[4691]: W0930 06:36:51.058329 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf3f1644_3ab8_4a6a_9f80_f8ea42297e98.slice/crio-a73f48753ce667bc99c2704da19e3728b9dbc3e4c6c0265a4d11ecec557be259 WatchSource:0}: Error finding container a73f48753ce667bc99c2704da19e3728b9dbc3e4c6c0265a4d11ecec557be259: Status 404 returned error can't find the container with id a73f48753ce667bc99c2704da19e3728b9dbc3e4c6c0265a4d11ecec557be259
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.060390 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:51 crc kubenswrapper[4691]: W0930 06:36:51.135463 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fef5a53_6fb3_4c3b_8929_e9e49f85f050.slice/crio-3bcd1d2e1d5351c54baa4f5e2bdb4a15b3646245a54b24c225a22d3206645882 WatchSource:0}: Error finding container 3bcd1d2e1d5351c54baa4f5e2bdb4a15b3646245a54b24c225a22d3206645882: Status 404 returned error can't find the container with id 3bcd1d2e1d5351c54baa4f5e2bdb4a15b3646245a54b24c225a22d3206645882
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.136463 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bf6754cd6-fsq4c"]
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.238746 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-applier-0"]
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.250708 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"]
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.269071 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"]
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.279987 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"]
Sep 30 06:36:51 crc kubenswrapper[4691]: E0930 06:36:51.280390 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.280410 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api"
Sep 30 06:36:51 crc kubenswrapper[4691]: E0930 06:36:51.280446 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api-log"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.280454 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api-log"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.280647 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.280658 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api-log"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.282425 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.291678 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.302167 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"]
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.368660 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.368721 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-config-data\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.368790 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt45q\" (UniqueName: \"kubernetes.io/projected/b2d59717-fe78-4912-ac77-bf28a8188b39-kube-api-access-qt45q\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.368850 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.368869 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2d59717-fe78-4912-ac77-bf28a8188b39-logs\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.471164 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.471264 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-config-data\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.471374 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qt45q\" (UniqueName: \"kubernetes.io/projected/b2d59717-fe78-4912-ac77-bf28a8188b39-kube-api-access-qt45q\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.471483 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.471520 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2d59717-fe78-4912-ac77-bf28a8188b39-logs\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.472016 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2d59717-fe78-4912-ac77-bf28a8188b39-logs\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.474782 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.475868 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.476713 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-config-data\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.488239 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt45q\" (UniqueName: \"kubernetes.io/projected/b2d59717-fe78-4912-ac77-bf28a8188b39-kube-api-access-qt45q\") pod \"watcher-api-0\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.605629 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.924730 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"901f3032-8727-419d-8de7-b00c08535ca1","Type":"ContainerStarted","Data":"315efb976058aad4b4b485010eb13699439948664ae5429a5c66b887eff36755"}
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.925002 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-applier-0" event={"ID":"901f3032-8727-419d-8de7-b00c08535ca1","Type":"ContainerStarted","Data":"f049a6c612a633b1eea5f08999fd297b5a5cfe876f8375658b4a04fc952f33d4"}
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.926126 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pblkf" event={"ID":"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21","Type":"ContainerStarted","Data":"c421916911fd50b60418d185f1c860b702b4ec4ec3250ddf3ec26d1b20cbdc09"}
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.929128 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6781ac88-7516-4101-8abd-9cacfbb930b7","Type":"ContainerStarted","Data":"3b94374bd067443ed31a28e4b285e89c05324fc370fedd8b4d95d64d45cb5855"}
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.930149 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bf6754cd6-fsq4c" event={"ID":"5fef5a53-6fb3-4c3b-8929-e9e49f85f050","Type":"ContainerStarted","Data":"13c801d094080ec035b04eb87fd1f5d94239189d7ec70ac5b118c59921d9bf05"}
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.930180 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bf6754cd6-fsq4c" event={"ID":"5fef5a53-6fb3-4c3b-8929-e9e49f85f050","Type":"ContainerStarted","Data":"3bcd1d2e1d5351c54baa4f5e2bdb4a15b3646245a54b24c225a22d3206645882"}
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.931043 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-bf6754cd6-fsq4c"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.932557 4691 generic.go:334] "Generic (PLEG): container finished" podID="1f010c19-f02f-4c8b-8b12-1f357e860666" containerID="f53ce7512c565ebd65d02bacc532dddd113915699808a3404774f62e491fb73d" exitCode=0
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.932609 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sb2qw" event={"ID":"1f010c19-f02f-4c8b-8b12-1f357e860666","Type":"ContainerDied","Data":"f53ce7512c565ebd65d02bacc532dddd113915699808a3404774f62e491fb73d"}
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.934861 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"af3f1644-3ab8-4a6a-9f80-f8ea42297e98","Type":"ContainerStarted","Data":"5501ce67b931bb5c3d6aeb1c094dd1c1c6449137f68ebefe633416b944309b7c"}
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.934915 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"af3f1644-3ab8-4a6a-9f80-f8ea42297e98","Type":"ContainerStarted","Data":"a73f48753ce667bc99c2704da19e3728b9dbc3e4c6c0265a4d11ecec557be259"}
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.937842 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-56d945494d-7svb6" event={"ID":"d35f539b-5139-4155-8f51-a1e425e19925","Type":"ContainerStarted","Data":"b1ad5894e150f7ea35f11da13d960a3a470f192049ea0fadeb9a5b27375a1074"}
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.937972 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56d945494d-7svb6"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.938003 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-56d945494d-7svb6"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.949864 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-applier-0" podStartSLOduration=12.949824408 podStartE2EDuration="12.949824408s" podCreationTimestamp="2025-09-30 06:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:51.93928457 +0000 UTC m=+1055.414305610" watchObservedRunningTime="2025-09-30 06:36:51.949824408 +0000 UTC m=+1055.424845468"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.967206 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-decision-engine-0" podStartSLOduration=17.967188436 podStartE2EDuration="17.967188436s" podCreationTimestamp="2025-09-30 06:36:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:51.960242163 +0000 UTC m=+1055.435263213" watchObservedRunningTime="2025-09-30 06:36:51.967188436 +0000 UTC m=+1055.442209476"
Sep 30 06:36:51 crc kubenswrapper[4691]: I0930 06:36:51.994628 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-pblkf" podStartSLOduration=15.466150966 podStartE2EDuration="32.994608816s" podCreationTimestamp="2025-09-30 06:36:19 +0000 UTC" firstStartedPulling="2025-09-30 06:36:21.645617866 +0000 UTC m=+1025.120638906" lastFinishedPulling="2025-09-30 06:36:39.174075706 +0000 UTC m=+1042.649096756" observedRunningTime="2025-09-30 06:36:51.984931796 +0000 UTC m=+1055.459952846" watchObservedRunningTime="2025-09-30 06:36:51.994608816 +0000 UTC m=+1055.469629856"
Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.015805 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-56d945494d-7svb6" podStartSLOduration=21.015783306 podStartE2EDuration="21.015783306s" podCreationTimestamp="2025-09-30 06:36:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:52.007446228 +0000 UTC m=+1055.482467278" watchObservedRunningTime="2025-09-30 06:36:52.015783306 +0000 UTC m=+1055.490804366"
Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.026491 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bf6754cd6-fsq4c" podStartSLOduration=17.026473449 podStartE2EDuration="17.026473449s" podCreationTimestamp="2025-09-30 06:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:52.0205906 +0000 UTC m=+1055.495611640" watchObservedRunningTime="2025-09-30 06:36:52.026473449 +0000 UTC m=+1055.501494479"
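
The startup-latency entries above make the accounting visible: podStartSLOduration is podStartE2EDuration minus time spent pulling images, which is why the two match whenever the pull timestamps are the zero value. The barbican-db-sync entry is the one case here with a real pull window, and the numbers check out; a small sketch verifying the arithmetic from the logged timestamps (the sub-microsecond residue comes from the tracker's internal clock rounding).

package main

import (
	"fmt"
	"time"
)

// Sketch: recompute barbican-db-sync's SLO duration from the log's fields.
// slo = podStartE2EDuration - (lastFinishedPulling - firstStartedPulling)
func main() {
	layout := "2006-01-02 15:04:05.999999999 -0700 MST"
	mustParse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}
	first := mustParse("2025-09-30 06:36:21.645617866 +0000 UTC")
	last := mustParse("2025-09-30 06:36:39.174075706 +0000 UTC")
	e2e := 32.994608816 // podStartE2EDuration from the log, in seconds
	pull := last.Sub(first).Seconds()
	// Prints pull≈17.528457840s, slo≈15.466150976s vs logged 15.466150966s.
	fmt.Printf("pull=%.9fs slo≈%.9fs (logged 15.466150966s)\n", pull, e2e-pull)
}
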
06:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:52.0205906 +0000 UTC m=+1055.495611640" watchObservedRunningTime="2025-09-30 06:36:52.026473449 +0000 UTC m=+1055.501494479" Sep 30 06:36:52 crc kubenswrapper[4691]: W0930 06:36:52.117942 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2d59717_fe78_4912_ac77_bf28a8188b39.slice/crio-d99e3f572ad75365f6ec7b9c45b342fcd41e9cdd9a46d16f7b7ae7a61d9124ea WatchSource:0}: Error finding container d99e3f572ad75365f6ec7b9c45b342fcd41e9cdd9a46d16f7b7ae7a61d9124ea: Status 404 returned error can't find the container with id d99e3f572ad75365f6ec7b9c45b342fcd41e9cdd9a46d16f7b7ae7a61d9124ea Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.121775 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.850418 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.850739 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.850786 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.851535 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"38f0c707492af70fdfb0f260acc0b7e0af55b1c1967ae7e929f5286c470b2dd6"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.851645 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://38f0c707492af70fdfb0f260acc0b7e0af55b1c1967ae7e929f5286c470b2dd6" gracePeriod=600 Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.951969 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b2d59717-fe78-4912-ac77-bf28a8188b39","Type":"ContainerStarted","Data":"1ff33ba74008cc7dd9c432f7c98e0e6bb7eda43d733a98199cabc425cf6ffdff"} Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.952190 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b2d59717-fe78-4912-ac77-bf28a8188b39","Type":"ContainerStarted","Data":"2fbf394317facd465bebb8c9cf8b0e21fa97d8fc44d13fec042173d3642e11d3"} Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.952201 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" 
event={"ID":"b2d59717-fe78-4912-ac77-bf28a8188b39","Type":"ContainerStarted","Data":"d99e3f572ad75365f6ec7b9c45b342fcd41e9cdd9a46d16f7b7ae7a61d9124ea"} Sep 30 06:36:52 crc kubenswrapper[4691]: I0930 06:36:52.983464 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=1.983439162 podStartE2EDuration="1.983439162s" podCreationTimestamp="2025-09-30 06:36:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:36:52.976677876 +0000 UTC m=+1056.451698976" watchObservedRunningTime="2025-09-30 06:36:52.983439162 +0000 UTC m=+1056.458460242" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.239188 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" path="/var/lib/kubelet/pods/a0cca0db-c02c-407f-9ba0-9dbaabdfa22d/volumes" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.328253 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.406643 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f292l\" (UniqueName: \"kubernetes.io/projected/1f010c19-f02f-4c8b-8b12-1f357e860666-kube-api-access-f292l\") pod \"1f010c19-f02f-4c8b-8b12-1f357e860666\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.406711 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-combined-ca-bundle\") pod \"1f010c19-f02f-4c8b-8b12-1f357e860666\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.406783 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-config\") pod \"1f010c19-f02f-4c8b-8b12-1f357e860666\" (UID: \"1f010c19-f02f-4c8b-8b12-1f357e860666\") " Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.413591 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f010c19-f02f-4c8b-8b12-1f357e860666-kube-api-access-f292l" (OuterVolumeSpecName: "kube-api-access-f292l") pod "1f010c19-f02f-4c8b-8b12-1f357e860666" (UID: "1f010c19-f02f-4c8b-8b12-1f357e860666"). InnerVolumeSpecName "kube-api-access-f292l". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.440735 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-config" (OuterVolumeSpecName: "config") pod "1f010c19-f02f-4c8b-8b12-1f357e860666" (UID: "1f010c19-f02f-4c8b-8b12-1f357e860666"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.445998 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f010c19-f02f-4c8b-8b12-1f357e860666" (UID: "1f010c19-f02f-4c8b-8b12-1f357e860666"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.508223 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f292l\" (UniqueName: \"kubernetes.io/projected/1f010c19-f02f-4c8b-8b12-1f357e860666-kube-api-access-f292l\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.508513 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.508525 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f010c19-f02f-4c8b-8b12-1f357e860666-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.964235 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-sb2qw" event={"ID":"1f010c19-f02f-4c8b-8b12-1f357e860666","Type":"ContainerDied","Data":"7c2b1cb34fb8435a0af0821e1b6738645ad380afe74f7ff343bdb442000c1ed7"} Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.964275 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c2b1cb34fb8435a0af0821e1b6738645ad380afe74f7ff343bdb442000c1ed7" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.964323 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-sb2qw" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.969079 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="38f0c707492af70fdfb0f260acc0b7e0af55b1c1967ae7e929f5286c470b2dd6" exitCode=0 Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.969135 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"38f0c707492af70fdfb0f260acc0b7e0af55b1c1967ae7e929f5286c470b2dd6"} Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.969155 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"b284d70235ce92b5dbfc6f06471e0d2494b74dc71ad661702951112856d0f82c"} Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.969170 4691 scope.go:117] "RemoveContainer" containerID="a93cc69e9131d7c4e2a3f6590c1d8cfd39f8977341d3f1a63ae9e1ccb3a86989" Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.973037 4691 generic.go:334] "Generic (PLEG): container finished" podID="3e1ab390-f1ae-4ec9-b5d6-fb137a511e21" containerID="c421916911fd50b60418d185f1c860b702b4ec4ec3250ddf3ec26d1b20cbdc09" exitCode=0 Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.973134 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pblkf" event={"ID":"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21","Type":"ContainerDied","Data":"c421916911fd50b60418d185f1c860b702b4ec4ec3250ddf3ec26d1b20cbdc09"} Sep 30 06:36:53 crc kubenswrapper[4691]: I0930 06:36:53.973496 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.041040 4691 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/horizon-5755bf4df8-zx9td" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.152917 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d9695899-d4zhb"] Sep 30 06:36:54 crc kubenswrapper[4691]: E0930 06:36:54.153333 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f010c19-f02f-4c8b-8b12-1f357e860666" containerName="neutron-db-sync" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.153349 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f010c19-f02f-4c8b-8b12-1f357e860666" containerName="neutron-db-sync" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.153540 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f010c19-f02f-4c8b-8b12-1f357e860666" containerName="neutron-db-sync" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.154443 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.173339 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d9695899-d4zhb"] Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.226934 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.226986 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwxzc\" (UniqueName: \"kubernetes.io/projected/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-kube-api-access-nwxzc\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.227114 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-swift-storage-0\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.227140 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-svc\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.230329 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.232427 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-config\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.277872 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-6f47894d84-xb69p"] Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.279538 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.284969 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-pzgvz" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.286230 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.286989 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.287134 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.293644 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f47894d84-xb69p"] Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.334852 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.335193 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-config\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.335239 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.335261 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwxzc\" (UniqueName: \"kubernetes.io/projected/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-kube-api-access-nwxzc\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.335349 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-swift-storage-0\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.335374 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-svc\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.340347 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-svc\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.342400 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-swift-storage-0\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.342906 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-config\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.347414 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-nb\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.350116 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-sb\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.370037 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwxzc\" (UniqueName: \"kubernetes.io/projected/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-kube-api-access-nwxzc\") pod \"dnsmasq-dns-5d9695899-d4zhb\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.438085 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn728\" (UniqueName: \"kubernetes.io/projected/4d6bfe51-c6c1-4062-b1e1-22905c50a142-kube-api-access-tn728\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.438121 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-combined-ca-bundle\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.438215 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-config\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.438254 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-ovndb-tls-certs\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.438274 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-httpd-config\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.479860 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.540261 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-ovndb-tls-certs\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.540294 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-httpd-config\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.540359 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-combined-ca-bundle\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.540381 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn728\" (UniqueName: \"kubernetes.io/projected/4d6bfe51-c6c1-4062-b1e1-22905c50a142-kube-api-access-tn728\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.540487 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-config\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.544649 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-httpd-config\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.546091 4691 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-config\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.546467 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-ovndb-tls-certs\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.550017 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-combined-ca-bundle\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.559734 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn728\" (UniqueName: \"kubernetes.io/projected/4d6bfe51-c6c1-4062-b1e1-22905c50a142-kube-api-access-tn728\") pod \"neutron-6f47894d84-xb69p\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") " pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.661454 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:36:54 crc kubenswrapper[4691]: I0930 06:36:54.899053 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.027411 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-67dcbcd77c-9lrb5" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.027611 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca89855f-21bb-4d05-93aa-5705f6d93548" containerID="95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc" exitCode=137 Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.027665 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca89855f-21bb-4d05-93aa-5705f6d93548" containerID="3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3" exitCode=137 Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.027683 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67dcbcd77c-9lrb5" event={"ID":"ca89855f-21bb-4d05-93aa-5705f6d93548","Type":"ContainerDied","Data":"95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc"} Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.027717 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67dcbcd77c-9lrb5" event={"ID":"ca89855f-21bb-4d05-93aa-5705f6d93548","Type":"ContainerDied","Data":"3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3"} Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.027726 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-67dcbcd77c-9lrb5" event={"ID":"ca89855f-21bb-4d05-93aa-5705f6d93548","Type":"ContainerDied","Data":"3817e72d071dad559c56b9f590afb0ee39a713a0f26a6196e6f73c2414d19909"} Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.027742 4691 scope.go:117] "RemoveContainer" containerID="95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.032596 4691 generic.go:334] "Generic (PLEG): container finished" podID="95c4dd18-3200-4193-9868-7315a13103b3" containerID="45fe2b4c11d6247066471f070f55ce4395f18598527836e58e649a9a6563713c" exitCode=137 Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.032613 4691 generic.go:334] "Generic (PLEG): container finished" podID="95c4dd18-3200-4193-9868-7315a13103b3" containerID="97368ae4b6b7644978046f502ca13057d3a35202a18e92e04b831593c190457f" exitCode=137 Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.032643 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-596cd479b5-w8rf2" event={"ID":"95c4dd18-3200-4193-9868-7315a13103b3","Type":"ContainerDied","Data":"45fe2b4c11d6247066471f070f55ce4395f18598527836e58e649a9a6563713c"} Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.032660 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-596cd479b5-w8rf2" event={"ID":"95c4dd18-3200-4193-9868-7315a13103b3","Type":"ContainerDied","Data":"97368ae4b6b7644978046f502ca13057d3a35202a18e92e04b831593c190457f"} Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.043838 4691 generic.go:334] "Generic (PLEG): container finished" podID="730e43c6-3b1f-4a8c-9540-4ff131592381" containerID="0a8b70070c7c4c4b4e43f91d12a71b2845f849b3dbe059d8b6fc0efc4dd3e0e8" exitCode=137 Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.043862 4691 generic.go:334] "Generic (PLEG): container finished" podID="730e43c6-3b1f-4a8c-9540-4ff131592381" containerID="be42e16a38be0dbe741dfa7d7ee8358eef6373849e72d61dfa4010eb140cc226" exitCode=137 Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.044014 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f69875457-kvdnf" 
event={"ID":"730e43c6-3b1f-4a8c-9540-4ff131592381","Type":"ContainerDied","Data":"0a8b70070c7c4c4b4e43f91d12a71b2845f849b3dbe059d8b6fc0efc4dd3e0e8"} Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.044040 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f69875457-kvdnf" event={"ID":"730e43c6-3b1f-4a8c-9540-4ff131592381","Type":"ContainerDied","Data":"be42e16a38be0dbe741dfa7d7ee8358eef6373849e72d61dfa4010eb140cc226"} Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.055287 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca89855f-21bb-4d05-93aa-5705f6d93548-logs\") pod \"ca89855f-21bb-4d05-93aa-5705f6d93548\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.055981 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca89855f-21bb-4d05-93aa-5705f6d93548-logs" (OuterVolumeSpecName: "logs") pod "ca89855f-21bb-4d05-93aa-5705f6d93548" (UID: "ca89855f-21bb-4d05-93aa-5705f6d93548"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.056106 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-scripts\") pod \"ca89855f-21bb-4d05-93aa-5705f6d93548\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.056244 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prxkn\" (UniqueName: \"kubernetes.io/projected/ca89855f-21bb-4d05-93aa-5705f6d93548-kube-api-access-prxkn\") pod \"ca89855f-21bb-4d05-93aa-5705f6d93548\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.056349 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-config-data\") pod \"ca89855f-21bb-4d05-93aa-5705f6d93548\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.056914 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca89855f-21bb-4d05-93aa-5705f6d93548-horizon-secret-key\") pod \"ca89855f-21bb-4d05-93aa-5705f6d93548\" (UID: \"ca89855f-21bb-4d05-93aa-5705f6d93548\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.064075 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca89855f-21bb-4d05-93aa-5705f6d93548-kube-api-access-prxkn" (OuterVolumeSpecName: "kube-api-access-prxkn") pod "ca89855f-21bb-4d05-93aa-5705f6d93548" (UID: "ca89855f-21bb-4d05-93aa-5705f6d93548"). InnerVolumeSpecName "kube-api-access-prxkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.064339 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-596cd479b5-w8rf2" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.068535 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ca89855f-21bb-4d05-93aa-5705f6d93548-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.071125 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca89855f-21bb-4d05-93aa-5705f6d93548-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "ca89855f-21bb-4d05-93aa-5705f6d93548" (UID: "ca89855f-21bb-4d05-93aa-5705f6d93548"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.073788 4691 generic.go:334] "Generic (PLEG): container finished" podID="af3f1644-3ab8-4a6a-9f80-f8ea42297e98" containerID="5501ce67b931bb5c3d6aeb1c094dd1c1c6449137f68ebefe633416b944309b7c" exitCode=1 Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.074042 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"af3f1644-3ab8-4a6a-9f80-f8ea42297e98","Type":"ContainerDied","Data":"5501ce67b931bb5c3d6aeb1c094dd1c1c6449137f68ebefe633416b944309b7c"} Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.074795 4691 scope.go:117] "RemoveContainer" containerID="5501ce67b931bb5c3d6aeb1c094dd1c1c6449137f68ebefe633416b944309b7c" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.086436 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f69875457-kvdnf" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.109551 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-scripts" (OuterVolumeSpecName: "scripts") pod "ca89855f-21bb-4d05-93aa-5705f6d93548" (UID: "ca89855f-21bb-4d05-93aa-5705f6d93548"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.127589 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-config-data" (OuterVolumeSpecName: "config-data") pod "ca89855f-21bb-4d05-93aa-5705f6d93548" (UID: "ca89855f-21bb-4d05-93aa-5705f6d93548"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.170086 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-scripts\") pod \"95c4dd18-3200-4193-9868-7315a13103b3\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.170431 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-config-data\") pod \"95c4dd18-3200-4193-9868-7315a13103b3\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.170523 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-scripts\") pod \"730e43c6-3b1f-4a8c-9540-4ff131592381\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.170571 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/730e43c6-3b1f-4a8c-9540-4ff131592381-logs\") pod \"730e43c6-3b1f-4a8c-9540-4ff131592381\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.170594 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpxzz\" (UniqueName: \"kubernetes.io/projected/730e43c6-3b1f-4a8c-9540-4ff131592381-kube-api-access-tpxzz\") pod \"730e43c6-3b1f-4a8c-9540-4ff131592381\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.170611 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c4dd18-3200-4193-9868-7315a13103b3-logs\") pod \"95c4dd18-3200-4193-9868-7315a13103b3\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.170644 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8qqs\" (UniqueName: \"kubernetes.io/projected/95c4dd18-3200-4193-9868-7315a13103b3-kube-api-access-s8qqs\") pod \"95c4dd18-3200-4193-9868-7315a13103b3\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.170666 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/730e43c6-3b1f-4a8c-9540-4ff131592381-horizon-secret-key\") pod \"730e43c6-3b1f-4a8c-9540-4ff131592381\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.170685 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-config-data\") pod \"730e43c6-3b1f-4a8c-9540-4ff131592381\" (UID: \"730e43c6-3b1f-4a8c-9540-4ff131592381\") " Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.170726 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/95c4dd18-3200-4193-9868-7315a13103b3-horizon-secret-key\") pod \"95c4dd18-3200-4193-9868-7315a13103b3\" (UID: \"95c4dd18-3200-4193-9868-7315a13103b3\") " 
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.171194 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.171212 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prxkn\" (UniqueName: \"kubernetes.io/projected/ca89855f-21bb-4d05-93aa-5705f6d93548-kube-api-access-prxkn\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.171222 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ca89855f-21bb-4d05-93aa-5705f6d93548-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.171232 4691 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/ca89855f-21bb-4d05-93aa-5705f6d93548-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.171640 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/730e43c6-3b1f-4a8c-9540-4ff131592381-logs" (OuterVolumeSpecName: "logs") pod "730e43c6-3b1f-4a8c-9540-4ff131592381" (UID: "730e43c6-3b1f-4a8c-9540-4ff131592381"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.172219 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c4dd18-3200-4193-9868-7315a13103b3-logs" (OuterVolumeSpecName: "logs") pod "95c4dd18-3200-4193-9868-7315a13103b3" (UID: "95c4dd18-3200-4193-9868-7315a13103b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.175680 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/730e43c6-3b1f-4a8c-9540-4ff131592381-kube-api-access-tpxzz" (OuterVolumeSpecName: "kube-api-access-tpxzz") pod "730e43c6-3b1f-4a8c-9540-4ff131592381" (UID: "730e43c6-3b1f-4a8c-9540-4ff131592381"). InnerVolumeSpecName "kube-api-access-tpxzz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.178773 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c4dd18-3200-4193-9868-7315a13103b3-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "95c4dd18-3200-4193-9868-7315a13103b3" (UID: "95c4dd18-3200-4193-9868-7315a13103b3"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.183994 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c4dd18-3200-4193-9868-7315a13103b3-kube-api-access-s8qqs" (OuterVolumeSpecName: "kube-api-access-s8qqs") pod "95c4dd18-3200-4193-9868-7315a13103b3" (UID: "95c4dd18-3200-4193-9868-7315a13103b3"). InnerVolumeSpecName "kube-api-access-s8qqs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.193389 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/730e43c6-3b1f-4a8c-9540-4ff131592381-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "730e43c6-3b1f-4a8c-9540-4ff131592381" (UID: "730e43c6-3b1f-4a8c-9540-4ff131592381"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.200329 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-config-data" (OuterVolumeSpecName: "config-data") pod "730e43c6-3b1f-4a8c-9540-4ff131592381" (UID: "730e43c6-3b1f-4a8c-9540-4ff131592381"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.215557 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-scripts" (OuterVolumeSpecName: "scripts") pod "730e43c6-3b1f-4a8c-9540-4ff131592381" (UID: "730e43c6-3b1f-4a8c-9540-4ff131592381"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.215503 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-scripts" (OuterVolumeSpecName: "scripts") pod "95c4dd18-3200-4193-9868-7315a13103b3" (UID: "95c4dd18-3200-4193-9868-7315a13103b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.226080 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-applier-0"
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.234174 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-config-data" (OuterVolumeSpecName: "config-data") pod "95c4dd18-3200-4193-9868-7315a13103b3" (UID: "95c4dd18-3200-4193-9868-7315a13103b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.244253 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d9695899-d4zhb"]
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.258914 4691 scope.go:117] "RemoveContainer" containerID="3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3"
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.274231 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.274379 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/730e43c6-3b1f-4a8c-9540-4ff131592381-logs\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.274436 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpxzz\" (UniqueName: \"kubernetes.io/projected/730e43c6-3b1f-4a8c-9540-4ff131592381-kube-api-access-tpxzz\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.274489 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/95c4dd18-3200-4193-9868-7315a13103b3-logs\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.274543 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8qqs\" (UniqueName: \"kubernetes.io/projected/95c4dd18-3200-4193-9868-7315a13103b3-kube-api-access-s8qqs\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.274595 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/730e43c6-3b1f-4a8c-9540-4ff131592381-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.274649 4691 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/730e43c6-3b1f-4a8c-9540-4ff131592381-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.274701 4691 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/95c4dd18-3200-4193-9868-7315a13103b3-horizon-secret-key\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.274751 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.274801 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/95c4dd18-3200-4193-9868-7315a13103b3-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.314072 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-6f47894d84-xb69p"]
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.339133 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/watcher-api-0" podUID="a0cca0db-c02c-407f-9ba0-9dbaabdfa22d" containerName="watcher-api" probeResult="failure" output="Get \"http://10.217.0.149:9322/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.339223 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.339240 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.374195 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-67dcbcd77c-9lrb5"]
Sep 30 06:36:55 crc kubenswrapper[4691]: I0930 06:36:55.379023 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-67dcbcd77c-9lrb5"]
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.088324 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-596cd479b5-w8rf2" event={"ID":"95c4dd18-3200-4193-9868-7315a13103b3","Type":"ContainerDied","Data":"f1718dd175fa0a0a38f6f40c796e91a7ca46eb4853eeef23b158e6929a434fe5"}
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.088366 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-596cd479b5-w8rf2"
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.089928 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-f69875457-kvdnf" event={"ID":"730e43c6-3b1f-4a8c-9540-4ff131592381","Type":"ContainerDied","Data":"973c7707713ff6b9ae3abbb47e9445dd8ea045b05ea1451d71c0c768973e3ff1"}
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.089967 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-f69875457-kvdnf"
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.094528 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9695899-d4zhb" event={"ID":"00f9a0ab-0ede-4a32-8fc4-baf3788218f8","Type":"ContainerStarted","Data":"d4b30d1df7ce21467d6a5d5780f602071ebd84823f380b14b3a7102b9d3d1796"}
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.128242 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-596cd479b5-w8rf2"]
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.155516 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-596cd479b5-w8rf2"]
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.164033 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-f69875457-kvdnf"]
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.171979 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-f69875457-kvdnf"]
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.606702 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.606988 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 06:36:56 crc kubenswrapper[4691]: I0930 06:36:56.929097 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.018576 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-8687477df-8l865"]
Sep 30 06:36:57 crc kubenswrapper[4691]: E0930 06:36:57.018897 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c4dd18-3200-4193-9868-7315a13103b3" containerName="horizon"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.018915 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c4dd18-3200-4193-9868-7315a13103b3" containerName="horizon"
Sep 30 06:36:57 crc kubenswrapper[4691]: E0930 06:36:57.018942 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="730e43c6-3b1f-4a8c-9540-4ff131592381" containerName="horizon-log"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.018948 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="730e43c6-3b1f-4a8c-9540-4ff131592381" containerName="horizon-log"
Sep 30 06:36:57 crc kubenswrapper[4691]: E0930 06:36:57.018960 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c4dd18-3200-4193-9868-7315a13103b3" containerName="horizon-log"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.018966 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c4dd18-3200-4193-9868-7315a13103b3" containerName="horizon-log"
Sep 30 06:36:57 crc kubenswrapper[4691]: E0930 06:36:57.018988 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca89855f-21bb-4d05-93aa-5705f6d93548" containerName="horizon"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.018994 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca89855f-21bb-4d05-93aa-5705f6d93548" containerName="horizon"
Sep 30 06:36:57 crc kubenswrapper[4691]: E0930 06:36:57.019004 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca89855f-21bb-4d05-93aa-5705f6d93548" containerName="horizon-log"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.019010 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca89855f-21bb-4d05-93aa-5705f6d93548" containerName="horizon-log"
Sep 30 06:36:57 crc kubenswrapper[4691]: E0930 06:36:57.019018 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="730e43c6-3b1f-4a8c-9540-4ff131592381" containerName="horizon"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.019024 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="730e43c6-3b1f-4a8c-9540-4ff131592381" containerName="horizon"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.019187 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca89855f-21bb-4d05-93aa-5705f6d93548" containerName="horizon-log"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.019200 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c4dd18-3200-4193-9868-7315a13103b3" containerName="horizon-log"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.019209 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="730e43c6-3b1f-4a8c-9540-4ff131592381" containerName="horizon-log"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.019220 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c4dd18-3200-4193-9868-7315a13103b3" containerName="horizon"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.019231 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca89855f-21bb-4d05-93aa-5705f6d93548" containerName="horizon"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.019245 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="730e43c6-3b1f-4a8c-9540-4ff131592381" containerName="horizon"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.020133 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.049689 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.049765 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.059416 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8687477df-8l865"]
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.121794 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-internal-tls-certs\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.121852 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-combined-ca-bundle\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.121917 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-public-tls-certs\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.121961 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-httpd-config\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.121978 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-ovndb-tls-certs\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.122018 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t76z\" (UniqueName: \"kubernetes.io/projected/7d0f6749-bfde-4329-9905-f51ef18e904c-kube-api-access-6t76z\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.122229 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-config\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.223971 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-config\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.224077 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-internal-tls-certs\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.224136 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-combined-ca-bundle\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.224208 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-public-tls-certs\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.224260 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-httpd-config\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.224276 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-ovndb-tls-certs\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.224302 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t76z\" (UniqueName: \"kubernetes.io/projected/7d0f6749-bfde-4329-9905-f51ef18e904c-kube-api-access-6t76z\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.231232 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-ovndb-tls-certs\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.232165 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-httpd-config\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.232431 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-internal-tls-certs\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.232715 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-public-tls-certs\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.241009 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-config\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.247088 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d0f6749-bfde-4329-9905-f51ef18e904c-combined-ca-bundle\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.253574 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="730e43c6-3b1f-4a8c-9540-4ff131592381" path="/var/lib/kubelet/pods/730e43c6-3b1f-4a8c-9540-4ff131592381/volumes"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.253840 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t76z\" (UniqueName: \"kubernetes.io/projected/7d0f6749-bfde-4329-9905-f51ef18e904c-kube-api-access-6t76z\") pod \"neutron-8687477df-8l865\" (UID: \"7d0f6749-bfde-4329-9905-f51ef18e904c\") " pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.254599 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c4dd18-3200-4193-9868-7315a13103b3" path="/var/lib/kubelet/pods/95c4dd18-3200-4193-9868-7315a13103b3/volumes"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.255933 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca89855f-21bb-4d05-93aa-5705f6d93548" path="/var/lib/kubelet/pods/ca89855f-21bb-4d05-93aa-5705f6d93548/volumes"
Sep 30 06:36:57 crc kubenswrapper[4691]: I0930 06:36:57.371776 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-8687477df-8l865"
Sep 30 06:36:59 crc kubenswrapper[4691]: I0930 06:36:59.982854 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pblkf" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.093560 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-db-sync-config-data\") pod \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.093637 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-combined-ca-bundle\") pod \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.093667 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tjv72\" (UniqueName: \"kubernetes.io/projected/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-kube-api-access-tjv72\") pod \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\" (UID: \"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21\") " Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.098817 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3e1ab390-f1ae-4ec9-b5d6-fb137a511e21" (UID: "3e1ab390-f1ae-4ec9-b5d6-fb137a511e21"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.098845 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-kube-api-access-tjv72" (OuterVolumeSpecName: "kube-api-access-tjv72") pod "3e1ab390-f1ae-4ec9-b5d6-fb137a511e21" (UID: "3e1ab390-f1ae-4ec9-b5d6-fb137a511e21"). InnerVolumeSpecName "kube-api-access-tjv72". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.140859 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3e1ab390-f1ae-4ec9-b5d6-fb137a511e21" (UID: "3e1ab390-f1ae-4ec9-b5d6-fb137a511e21"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.143582 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-pblkf" event={"ID":"3e1ab390-f1ae-4ec9-b5d6-fb137a511e21","Type":"ContainerDied","Data":"b9703e08c9440c1c4381523edd6c55dba9d1a581cb0311632ae4dd7494ec41bf"} Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.143630 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9703e08c9440c1c4381523edd6c55dba9d1a581cb0311632ae4dd7494ec41bf" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.143663 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-pblkf" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.195626 4691 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.195652 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.195663 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tjv72\" (UniqueName: \"kubernetes.io/projected/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21-kube-api-access-tjv72\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.225597 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-applier-0" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.301546 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-applier-0" Sep 30 06:37:00 crc kubenswrapper[4691]: W0930 06:37:00.771701 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d6bfe51_c6c1_4062_b1e1_22905c50a142.slice/crio-578d6ca02bf5638cef2fd8a37adebbfeebdfd3896a7769eaf6f7f33ec501117b WatchSource:0}: Error finding container 578d6ca02bf5638cef2fd8a37adebbfeebdfd3896a7769eaf6f7f33ec501117b: Status 404 returned error can't find the container with id 578d6ca02bf5638cef2fd8a37adebbfeebdfd3896a7769eaf6f7f33ec501117b Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.802492 4691 scope.go:117] "RemoveContainer" containerID="95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc" Sep 30 06:37:00 crc kubenswrapper[4691]: E0930 06:37:00.803448 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc\": container with ID starting with 95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc not found: ID does not exist" containerID="95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.803542 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc"} err="failed to get container status \"95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc\": rpc error: code = NotFound desc = could not find container \"95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc\": container with ID starting with 95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc not found: ID does not exist" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.803637 4691 scope.go:117] "RemoveContainer" containerID="3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3" Sep 30 06:37:00 crc kubenswrapper[4691]: E0930 06:37:00.804241 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3\": container with ID starting with 3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3 not 
found: ID does not exist" containerID="3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.804282 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3"} err="failed to get container status \"3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3\": rpc error: code = NotFound desc = could not find container \"3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3\": container with ID starting with 3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3 not found: ID does not exist" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.804308 4691 scope.go:117] "RemoveContainer" containerID="95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.804735 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc"} err="failed to get container status \"95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc\": rpc error: code = NotFound desc = could not find container \"95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc\": container with ID starting with 95d2427331b0abd1a63d88739144370f1f4b11996b83b09c945ab4a2bfed6adc not found: ID does not exist" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.804760 4691 scope.go:117] "RemoveContainer" containerID="3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.805584 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3"} err="failed to get container status \"3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3\": rpc error: code = NotFound desc = could not find container \"3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3\": container with ID starting with 3e7e8bac6245462541c670bb16eee4e523e1118c6c7a122350fd5ef75f6ad5a3 not found: ID does not exist" Sep 30 06:37:00 crc kubenswrapper[4691]: I0930 06:37:00.805626 4691 scope.go:117] "RemoveContainer" containerID="45fe2b4c11d6247066471f070f55ce4395f18598527836e58e649a9a6563713c" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.114704 4691 scope.go:117] "RemoveContainer" containerID="97368ae4b6b7644978046f502ca13057d3a35202a18e92e04b831593c190457f" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.162270 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47894d84-xb69p" event={"ID":"4d6bfe51-c6c1-4062-b1e1-22905c50a142","Type":"ContainerStarted","Data":"578d6ca02bf5638cef2fd8a37adebbfeebdfd3896a7769eaf6f7f33ec501117b"} Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.209795 4691 scope.go:117] "RemoveContainer" containerID="0a8b70070c7c4c4b4e43f91d12a71b2845f849b3dbe059d8b6fc0efc4dd3e0e8" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.272046 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-applier-0" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.319286 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k"] Sep 30 06:37:01 crc kubenswrapper[4691]: E0930 06:37:01.319759 4691 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="3e1ab390-f1ae-4ec9-b5d6-fb137a511e21" containerName="barbican-db-sync" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.319775 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e1ab390-f1ae-4ec9-b5d6-fb137a511e21" containerName="barbican-db-sync" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.320699 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e1ab390-f1ae-4ec9-b5d6-fb137a511e21" containerName="barbican-db-sync" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.322322 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.328039 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.328241 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.328354 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-84nz8" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.348107 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5d7ff878f-9tz9w"] Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.351068 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.356201 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.377987 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k"] Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.395177 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d7ff878f-9tz9w"] Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.453732 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpnlk\" (UniqueName: \"kubernetes.io/projected/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-kube-api-access-xpnlk\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.454039 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d94cb2d-a415-4b43-9976-0a844c446734-config-data\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.454083 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d94cb2d-a415-4b43-9976-0a844c446734-combined-ca-bundle\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.454107 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-config-data\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.454130 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d94cb2d-a415-4b43-9976-0a844c446734-logs\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.454163 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbbn\" (UniqueName: \"kubernetes.io/projected/9d94cb2d-a415-4b43-9976-0a844c446734-kube-api-access-ddbbn\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.454212 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-combined-ca-bundle\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.454242 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d94cb2d-a415-4b43-9976-0a844c446734-config-data-custom\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.454262 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-config-data-custom\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.454275 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-logs\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.466393 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9695899-d4zhb"] Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.477593 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5865f587f5-rvd99"] Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.479276 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.490026 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5865f587f5-rvd99"] Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.495045 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-67959b9db8-p8w6w"] Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.496460 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.498968 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.511292 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67959b9db8-p8w6w"] Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556035 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpnlk\" (UniqueName: \"kubernetes.io/projected/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-kube-api-access-xpnlk\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556098 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-config\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556119 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-swift-storage-0\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556140 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d94cb2d-a415-4b43-9976-0a844c446734-config-data\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556176 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-combined-ca-bundle\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556211 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-svc\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556249 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9d94cb2d-a415-4b43-9976-0a844c446734-combined-ca-bundle\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556275 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-config-data\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556291 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data-custom\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556334 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d94cb2d-a415-4b43-9976-0a844c446734-logs\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556359 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddktz\" (UniqueName: \"kubernetes.io/projected/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-kube-api-access-ddktz\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556377 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-sb\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556414 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556434 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbbn\" (UniqueName: \"kubernetes.io/projected/9d94cb2d-a415-4b43-9976-0a844c446734-kube-api-access-ddbbn\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.556450 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-nb\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.557050 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-combined-ca-bundle\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.557077 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmq2r\" (UniqueName: \"kubernetes.io/projected/ab991360-2557-48e2-b39e-91dece03bcbe-kube-api-access-zmq2r\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.557113 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d94cb2d-a415-4b43-9976-0a844c446734-config-data-custom\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.557137 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-logs\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.557135 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9d94cb2d-a415-4b43-9976-0a844c446734-logs\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.557854 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-config-data-custom\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.560726 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9d94cb2d-a415-4b43-9976-0a844c446734-config-data-custom\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.560779 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-logs\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.561201 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-logs\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 
06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.563669 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9d94cb2d-a415-4b43-9976-0a844c446734-combined-ca-bundle\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.564093 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-config-data-custom\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.576832 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9d94cb2d-a415-4b43-9976-0a844c446734-config-data\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.576886 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-config-data\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.577336 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-combined-ca-bundle\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.581399 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpnlk\" (UniqueName: \"kubernetes.io/projected/bccc96eb-4a1a-44bc-8086-eb5e7a7ce253-kube-api-access-xpnlk\") pod \"barbican-keystone-listener-5c7dfdd7cd-qdz5k\" (UID: \"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253\") " pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.585011 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbbn\" (UniqueName: \"kubernetes.io/projected/9d94cb2d-a415-4b43-9976-0a844c446734-kube-api-access-ddbbn\") pod \"barbican-worker-5d7ff878f-9tz9w\" (UID: \"9d94cb2d-a415-4b43-9976-0a844c446734\") " pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.606800 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.612826 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.664435 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddktz\" (UniqueName: \"kubernetes.io/projected/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-kube-api-access-ddktz\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " 
pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.664479 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-sb\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.664503 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.664529 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-nb\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.664602 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmq2r\" (UniqueName: \"kubernetes.io/projected/ab991360-2557-48e2-b39e-91dece03bcbe-kube-api-access-zmq2r\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.664641 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-logs\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.664703 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-config\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.664724 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-swift-storage-0\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.664744 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-combined-ca-bundle\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.664782 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-svc\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc 
kubenswrapper[4691]: I0930 06:37:01.664810 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data-custom\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.665075 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-logs\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.665710 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-sb\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.665937 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-config\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.666735 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-swift-storage-0\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.668268 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-nb\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.669430 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-svc\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.674036 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.677521 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-combined-ca-bundle\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.678104 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data-custom\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.681693 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddktz\" (UniqueName: \"kubernetes.io/projected/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-kube-api-access-ddktz\") pod \"barbican-api-67959b9db8-p8w6w\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.684638 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmq2r\" (UniqueName: \"kubernetes.io/projected/ab991360-2557-48e2-b39e-91dece03bcbe-kube-api-access-zmq2r\") pod \"dnsmasq-dns-5865f587f5-rvd99\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.706709 4691 scope.go:117] "RemoveContainer" containerID="be42e16a38be0dbe741dfa7d7ee8358eef6373849e72d61dfa4010eb140cc226" Sep 30 06:37:01 crc kubenswrapper[4691]: E0930 06:37:01.793531 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.817467 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.824778 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d7ff878f-9tz9w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.849435 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.868644 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:01 crc kubenswrapper[4691]: I0930 06:37:01.931668 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-8687477df-8l865"] Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.179680 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8687477df-8l865" event={"ID":"7d0f6749-bfde-4329-9905-f51ef18e904c","Type":"ContainerStarted","Data":"6f8ce8385903eab7e5c71bde0f2b1ecd240d23661f5f770e52359c902e69c5be"} Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.184968 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"af3f1644-3ab8-4a6a-9f80-f8ea42297e98","Type":"ContainerStarted","Data":"0bda2d329b80bafe79c6a51e31c421bb8b3254080dc231fa2738d610c27ec014"} Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.192055 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47894d84-xb69p" event={"ID":"4d6bfe51-c6c1-4062-b1e1-22905c50a142","Type":"ContainerStarted","Data":"90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c"} Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.192089 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47894d84-xb69p" event={"ID":"4d6bfe51-c6c1-4062-b1e1-22905c50a142","Type":"ContainerStarted","Data":"9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802"} Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.192688 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.262285 4691 generic.go:334] "Generic (PLEG): container finished" podID="00f9a0ab-0ede-4a32-8fc4-baf3788218f8" containerID="d36d6bc38729f413b76c93bc71b94701255a6b816c83011f4876ecb1ac0c41f2" exitCode=0 Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.262754 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-6f47894d84-xb69p" podStartSLOduration=8.262732268 podStartE2EDuration="8.262732268s" podCreationTimestamp="2025-09-30 06:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:02.247977584 +0000 UTC m=+1065.722998624" watchObservedRunningTime="2025-09-30 06:37:02.262732268 +0000 UTC m=+1065.737753308" Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.263006 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9695899-d4zhb" event={"ID":"00f9a0ab-0ede-4a32-8fc4-baf3788218f8","Type":"ContainerDied","Data":"d36d6bc38729f413b76c93bc71b94701255a6b816c83011f4876ecb1ac0c41f2"} Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.281384 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="ceilometer-notification-agent" containerID="cri-o://50d0d88a5f1231386758cbaf3ab91925e14ccbf64d6051e87f834402f40a6ee1" gracePeriod=30 Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.281482 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6781ac88-7516-4101-8abd-9cacfbb930b7","Type":"ContainerStarted","Data":"1f25bc6dda3eba3f29ffd650446bc5056db4492240f2fd9795efaa7075f86d61"} Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.281516 4691 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="proxy-httpd" containerID="cri-o://1f25bc6dda3eba3f29ffd650446bc5056db4492240f2fd9795efaa7075f86d61" gracePeriod=30 Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.281592 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="sg-core" containerID="cri-o://3b94374bd067443ed31a28e4b285e89c05324fc370fedd8b4d95d64d45cb5855" gracePeriod=30 Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.282007 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.289056 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.394307 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d7ff878f-9tz9w"] Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.454679 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k"] Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.600011 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5865f587f5-rvd99"] Sep 30 06:37:02 crc kubenswrapper[4691]: W0930 06:37:02.657929 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab991360_2557_48e2_b39e_91dece03bcbe.slice/crio-c99bca1cca04bba916cc8a5f28435bd0cdd1222c5d5f29163157bf3f7d99ee45 WatchSource:0}: Error finding container c99bca1cca04bba916cc8a5f28435bd0cdd1222c5d5f29163157bf3f7d99ee45: Status 404 returned error can't find the container with id c99bca1cca04bba916cc8a5f28435bd0cdd1222c5d5f29163157bf3f7d99ee45 Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.794260 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-67959b9db8-p8w6w"] Sep 30 06:37:02 crc kubenswrapper[4691]: W0930 06:37:02.800216 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbfa4b8e_37ac_4713_a8af_b610f086f7e2.slice/crio-feb2146197bcd200b2070e9ef6b33126ce9c44667ce4e55e7a22dde92f49def9 WatchSource:0}: Error finding container feb2146197bcd200b2070e9ef6b33126ce9c44667ce4e55e7a22dde92f49def9: Status 404 returned error can't find the container with id feb2146197bcd200b2070e9ef6b33126ce9c44667ce4e55e7a22dde92f49def9 Sep 30 06:37:02 crc kubenswrapper[4691]: I0930 06:37:02.889205 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.021457 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-config\") pod \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.021724 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-swift-storage-0\") pod \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.021747 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-svc\") pod \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.021774 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-nb\") pod \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.021859 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwxzc\" (UniqueName: \"kubernetes.io/projected/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-kube-api-access-nwxzc\") pod \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.021946 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-sb\") pod \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\" (UID: \"00f9a0ab-0ede-4a32-8fc4-baf3788218f8\") " Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.069055 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "00f9a0ab-0ede-4a32-8fc4-baf3788218f8" (UID: "00f9a0ab-0ede-4a32-8fc4-baf3788218f8"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.086975 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "00f9a0ab-0ede-4a32-8fc4-baf3788218f8" (UID: "00f9a0ab-0ede-4a32-8fc4-baf3788218f8"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.118052 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-kube-api-access-nwxzc" (OuterVolumeSpecName: "kube-api-access-nwxzc") pod "00f9a0ab-0ede-4a32-8fc4-baf3788218f8" (UID: "00f9a0ab-0ede-4a32-8fc4-baf3788218f8"). InnerVolumeSpecName "kube-api-access-nwxzc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.126013 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.126359 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.126433 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwxzc\" (UniqueName: \"kubernetes.io/projected/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-kube-api-access-nwxzc\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.138040 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "00f9a0ab-0ede-4a32-8fc4-baf3788218f8" (UID: "00f9a0ab-0ede-4a32-8fc4-baf3788218f8"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.165113 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "00f9a0ab-0ede-4a32-8fc4-baf3788218f8" (UID: "00f9a0ab-0ede-4a32-8fc4-baf3788218f8"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.232294 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.232329 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.279052 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-config" (OuterVolumeSpecName: "config") pod "00f9a0ab-0ede-4a32-8fc4-baf3788218f8" (UID: "00f9a0ab-0ede-4a32-8fc4-baf3788218f8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.327927 4691 generic.go:334] "Generic (PLEG): container finished" podID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerID="1f25bc6dda3eba3f29ffd650446bc5056db4492240f2fd9795efaa7075f86d61" exitCode=0 Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.327953 4691 generic.go:334] "Generic (PLEG): container finished" podID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerID="3b94374bd067443ed31a28e4b285e89c05324fc370fedd8b4d95d64d45cb5855" exitCode=2 Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.327993 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6781ac88-7516-4101-8abd-9cacfbb930b7","Type":"ContainerDied","Data":"1f25bc6dda3eba3f29ffd650446bc5056db4492240f2fd9795efaa7075f86d61"} Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.328016 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6781ac88-7516-4101-8abd-9cacfbb930b7","Type":"ContainerDied","Data":"3b94374bd067443ed31a28e4b285e89c05324fc370fedd8b4d95d64d45cb5855"} Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.335311 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/00f9a0ab-0ede-4a32-8fc4-baf3788218f8-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.350156 4691 generic.go:334] "Generic (PLEG): container finished" podID="ab991360-2557-48e2-b39e-91dece03bcbe" containerID="ecbcabc3de94e7fe361d4043d01edb1a41343dd2b9c59e366f08316095a7c4b2" exitCode=0 Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.350692 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" event={"ID":"ab991360-2557-48e2-b39e-91dece03bcbe","Type":"ContainerDied","Data":"ecbcabc3de94e7fe361d4043d01edb1a41343dd2b9c59e366f08316095a7c4b2"} Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.350735 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" event={"ID":"ab991360-2557-48e2-b39e-91dece03bcbe","Type":"ContainerStarted","Data":"c99bca1cca04bba916cc8a5f28435bd0cdd1222c5d5f29163157bf3f7d99ee45"} Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.410394 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8687477df-8l865" event={"ID":"7d0f6749-bfde-4329-9905-f51ef18e904c","Type":"ContainerStarted","Data":"617319bf1f9906d7504e8b36cca0d5423ce3bdfd64157eb797d28adc06cc9890"} Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.410667 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-8687477df-8l865" event={"ID":"7d0f6749-bfde-4329-9905-f51ef18e904c","Type":"ContainerStarted","Data":"5be0be89578ad759644d787185bda7c36c4e0a8815b8b47a51241fa149f7efb3"} Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.411176 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-8687477df-8l865" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.437577 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d7ff878f-9tz9w" event={"ID":"9d94cb2d-a415-4b43-9976-0a844c446734","Type":"ContainerStarted","Data":"1f568fd2c053a8e58bfaaf37b559900f0178521a67293860cefdeacae68efe74"} Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.517501 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" event={"ID":"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253","Type":"ContainerStarted","Data":"d02ae6997a092d610dc81417d8bf68f423e738906074ea5f412e0b21679b8bd8"} Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.525969 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-8687477df-8l865" podStartSLOduration=7.52594752 podStartE2EDuration="7.52594752s" podCreationTimestamp="2025-09-30 06:36:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:03.442191281 +0000 UTC m=+1066.917212341" watchObservedRunningTime="2025-09-30 06:37:03.52594752 +0000 UTC m=+1067.000968560" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.539185 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67959b9db8-p8w6w" event={"ID":"cbfa4b8e-37ac-4713-a8af-b610f086f7e2","Type":"ContainerStarted","Data":"feb2146197bcd200b2070e9ef6b33126ce9c44667ce4e55e7a22dde92f49def9"} Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.547315 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d9695899-d4zhb" event={"ID":"00f9a0ab-0ede-4a32-8fc4-baf3788218f8","Type":"ContainerDied","Data":"d4b30d1df7ce21467d6a5d5780f602071ebd84823f380b14b3a7102b9d3d1796"} Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.547373 4691 scope.go:117] "RemoveContainer" containerID="d36d6bc38729f413b76c93bc71b94701255a6b816c83011f4876ecb1ac0c41f2" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.547561 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d9695899-d4zhb" Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.674065 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d9695899-d4zhb"] Sep 30 06:37:03 crc kubenswrapper[4691]: I0930 06:37:03.708616 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d9695899-d4zhb"] Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.040928 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5755bf4df8-zx9td" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.462938 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-68888bb5f6-d225g"] Sep 30 06:37:04 crc kubenswrapper[4691]: E0930 06:37:04.463561 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00f9a0ab-0ede-4a32-8fc4-baf3788218f8" containerName="init" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.463574 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="00f9a0ab-0ede-4a32-8fc4-baf3788218f8" containerName="init" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.463730 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="00f9a0ab-0ede-4a32-8fc4-baf3788218f8" containerName="init" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.464818 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.466888 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.466964 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.488197 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68888bb5f6-d225g"] Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.560989 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67959b9db8-p8w6w" event={"ID":"cbfa4b8e-37ac-4713-a8af-b610f086f7e2","Type":"ContainerStarted","Data":"b6752c80f69833593d27c5ffb66ab3315ee613a1f95e62af78e01a7a5d3dd421"} Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.561028 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67959b9db8-p8w6w" event={"ID":"cbfa4b8e-37ac-4713-a8af-b610f086f7e2","Type":"ContainerStarted","Data":"ef517aaa6758085874127d7c6326aed742a754e9537fde8af3ab95d6767d0e04"} Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.563288 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.563687 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.573098 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fh78b" event={"ID":"071e402d-9775-412e-ad8a-1643cd646d7c","Type":"ContainerStarted","Data":"8b4784efaa23c98f9e640e2c991c0a4de71215216dde1b5054d199bea4aaf3fd"} Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.580484 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7v29k\" (UniqueName: \"kubernetes.io/projected/29d7ded5-bae4-41e2-9aa9-c959091d3696-kube-api-access-7v29k\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.580553 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-combined-ca-bundle\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.580649 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-config-data\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.580738 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-config-data-custom\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc 
kubenswrapper[4691]: I0930 06:37:04.580784 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-internal-tls-certs\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.580816 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29d7ded5-bae4-41e2-9aa9-c959091d3696-logs\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.580938 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-public-tls-certs\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.586073 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-67959b9db8-p8w6w" podStartSLOduration=3.586049604 podStartE2EDuration="3.586049604s" podCreationTimestamp="2025-09-30 06:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:04.57784232 +0000 UTC m=+1068.052863360" watchObservedRunningTime="2025-09-30 06:37:04.586049604 +0000 UTC m=+1068.061070634" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.597738 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" event={"ID":"ab991360-2557-48e2-b39e-91dece03bcbe","Type":"ContainerStarted","Data":"8c0cf071d81fdd3a88c2d92130296e4f87e7baf332fbdd0f9820c61cae827b36"} Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.597782 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.631257 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-fh78b" podStartSLOduration=5.93138277 podStartE2EDuration="45.631238004s" podCreationTimestamp="2025-09-30 06:36:19 +0000 UTC" firstStartedPulling="2025-09-30 06:36:22.663163603 +0000 UTC m=+1026.138184643" lastFinishedPulling="2025-09-30 06:37:02.363018837 +0000 UTC m=+1065.838039877" observedRunningTime="2025-09-30 06:37:04.59842125 +0000 UTC m=+1068.073442290" watchObservedRunningTime="2025-09-30 06:37:04.631238004 +0000 UTC m=+1068.106259034" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.634784 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" podStartSLOduration=3.634777727 podStartE2EDuration="3.634777727s" podCreationTimestamp="2025-09-30 06:37:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:04.623304019 +0000 UTC m=+1068.098325079" watchObservedRunningTime="2025-09-30 06:37:04.634777727 +0000 UTC m=+1068.109798767" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.688119 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-config-data-custom\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.688190 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-internal-tls-certs\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.688216 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29d7ded5-bae4-41e2-9aa9-c959091d3696-logs\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.688276 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-public-tls-certs\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.688397 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7v29k\" (UniqueName: \"kubernetes.io/projected/29d7ded5-bae4-41e2-9aa9-c959091d3696-kube-api-access-7v29k\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.688426 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-combined-ca-bundle\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.688471 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-config-data\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.692935 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/29d7ded5-bae4-41e2-9aa9-c959091d3696-logs\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.698701 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-combined-ca-bundle\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.698727 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-public-tls-certs\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.699082 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-internal-tls-certs\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.700709 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-config-data-custom\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.703938 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/29d7ded5-bae4-41e2-9aa9-c959091d3696-config-data\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.708885 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7v29k\" (UniqueName: \"kubernetes.io/projected/29d7ded5-bae4-41e2-9aa9-c959091d3696-kube-api-access-7v29k\") pod \"barbican-api-68888bb5f6-d225g\" (UID: \"29d7ded5-bae4-41e2-9aa9-c959091d3696\") " pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.741079 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56d945494d-7svb6" Sep 30 06:37:04 crc kubenswrapper[4691]: I0930 06:37:04.793869 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:05 crc kubenswrapper[4691]: I0930 06:37:05.239755 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00f9a0ab-0ede-4a32-8fc4-baf3788218f8" path="/var/lib/kubelet/pods/00f9a0ab-0ede-4a32-8fc4-baf3788218f8/volumes" Sep 30 06:37:05 crc kubenswrapper[4691]: I0930 06:37:05.335871 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0" Sep 30 06:37:05 crc kubenswrapper[4691]: I0930 06:37:05.335979 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0" Sep 30 06:37:05 crc kubenswrapper[4691]: I0930 06:37:05.400594 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0" Sep 30 06:37:05 crc kubenswrapper[4691]: I0930 06:37:05.460237 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-56d945494d-7svb6" Sep 30 06:37:05 crc kubenswrapper[4691]: I0930 06:37:05.631523 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" event={"ID":"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253","Type":"ContainerStarted","Data":"7f12a6169710159d6d2e617b238f45394ebbbb1de30feeb8a4517cc24dacc0dd"} Sep 30 06:37:05 crc kubenswrapper[4691]: I0930 06:37:05.664105 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0" Sep 30 06:37:05 crc kubenswrapper[4691]: W0930 06:37:05.886382 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29d7ded5_bae4_41e2_9aa9_c959091d3696.slice/crio-42239300a244c1623d90f727415a1e0b10daa47672a6a9eafca53cdd58445b56 WatchSource:0}: Error finding container 42239300a244c1623d90f727415a1e0b10daa47672a6a9eafca53cdd58445b56: Status 404 returned error can't find the container with id 42239300a244c1623d90f727415a1e0b10daa47672a6a9eafca53cdd58445b56 Sep 30 06:37:05 crc kubenswrapper[4691]: I0930 06:37:05.891530 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-68888bb5f6-d225g"] Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.651022 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68888bb5f6-d225g" event={"ID":"29d7ded5-bae4-41e2-9aa9-c959091d3696","Type":"ContainerStarted","Data":"6047a89345ebe4e7afe4b3aaa23a7ad9848d939cede49afb7d3388e362d398b4"} Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.651494 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68888bb5f6-d225g" event={"ID":"29d7ded5-bae4-41e2-9aa9-c959091d3696","Type":"ContainerStarted","Data":"c8d739a6c9e00d2749706b11e509607db471f92852839be9b411a0a56e147b09"} Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.651515 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-68888bb5f6-d225g" event={"ID":"29d7ded5-bae4-41e2-9aa9-c959091d3696","Type":"ContainerStarted","Data":"42239300a244c1623d90f727415a1e0b10daa47672a6a9eafca53cdd58445b56"} Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.652036 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68888bb5f6-d225g" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.652138 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-68888bb5f6-d225g" 
Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.661797 4691 generic.go:334] "Generic (PLEG): container finished" podID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerID="50d0d88a5f1231386758cbaf3ab91925e14ccbf64d6051e87f834402f40a6ee1" exitCode=0 Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.661871 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6781ac88-7516-4101-8abd-9cacfbb930b7","Type":"ContainerDied","Data":"50d0d88a5f1231386758cbaf3ab91925e14ccbf64d6051e87f834402f40a6ee1"} Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.661968 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6781ac88-7516-4101-8abd-9cacfbb930b7","Type":"ContainerDied","Data":"7f7be92aa56741158622c73542f0e8a2e3cf431bdb549192ed6cc40991f90812"} Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.661984 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7f7be92aa56741158622c73542f0e8a2e3cf431bdb549192ed6cc40991f90812" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.681730 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-68888bb5f6-d225g" podStartSLOduration=2.681707823 podStartE2EDuration="2.681707823s" podCreationTimestamp="2025-09-30 06:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:06.678651654 +0000 UTC m=+1070.153672714" watchObservedRunningTime="2025-09-30 06:37:06.681707823 +0000 UTC m=+1070.156728863" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.685652 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d7ff878f-9tz9w" event={"ID":"9d94cb2d-a415-4b43-9976-0a844c446734","Type":"ContainerStarted","Data":"7ac244a44f938f11b80abe75d1f92a0bcc9014972ad9bc334ddd435fcc2981a5"} Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.685692 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d7ff878f-9tz9w" event={"ID":"9d94cb2d-a415-4b43-9976-0a844c446734","Type":"ContainerStarted","Data":"b6f6d6f2258e8b2ce10b136c8a667cf91609add0ccb257e074b36d658594bc56"} Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.690561 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" event={"ID":"bccc96eb-4a1a-44bc-8086-eb5e7a7ce253","Type":"ContainerStarted","Data":"99670c877e70c646a174dff418628e62e9cee6419ba44f936dfe47a9911edb90"} Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.725554 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5d7ff878f-9tz9w" podStartSLOduration=2.797506495 podStartE2EDuration="5.725535739s" podCreationTimestamp="2025-09-30 06:37:01 +0000 UTC" firstStartedPulling="2025-09-30 06:37:02.3852619 +0000 UTC m=+1065.860282940" lastFinishedPulling="2025-09-30 06:37:05.313291144 +0000 UTC m=+1068.788312184" observedRunningTime="2025-09-30 06:37:06.725092165 +0000 UTC m=+1070.200113205" watchObservedRunningTime="2025-09-30 06:37:06.725535739 +0000 UTC m=+1070.200556779" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.765473 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.779146 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-5c7dfdd7cd-qdz5k" podStartSLOduration=2.985489258 podStartE2EDuration="5.779123059s" podCreationTimestamp="2025-09-30 06:37:01 +0000 UTC" firstStartedPulling="2025-09-30 06:37:02.518502866 +0000 UTC m=+1065.993523906" lastFinishedPulling="2025-09-30 06:37:05.312136667 +0000 UTC m=+1068.787157707" observedRunningTime="2025-09-30 06:37:06.740391906 +0000 UTC m=+1070.215412956" watchObservedRunningTime="2025-09-30 06:37:06.779123059 +0000 UTC m=+1070.254144099" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.859466 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-config-data\") pod \"6781ac88-7516-4101-8abd-9cacfbb930b7\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.859515 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-sg-core-conf-yaml\") pod \"6781ac88-7516-4101-8abd-9cacfbb930b7\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.859610 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-log-httpd\") pod \"6781ac88-7516-4101-8abd-9cacfbb930b7\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.859657 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmmmv\" (UniqueName: \"kubernetes.io/projected/6781ac88-7516-4101-8abd-9cacfbb930b7-kube-api-access-vmmmv\") pod \"6781ac88-7516-4101-8abd-9cacfbb930b7\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.859677 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-combined-ca-bundle\") pod \"6781ac88-7516-4101-8abd-9cacfbb930b7\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.859788 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-scripts\") pod \"6781ac88-7516-4101-8abd-9cacfbb930b7\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.859805 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-run-httpd\") pod \"6781ac88-7516-4101-8abd-9cacfbb930b7\" (UID: \"6781ac88-7516-4101-8abd-9cacfbb930b7\") " Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.860654 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6781ac88-7516-4101-8abd-9cacfbb930b7" (UID: "6781ac88-7516-4101-8abd-9cacfbb930b7"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.864639 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6781ac88-7516-4101-8abd-9cacfbb930b7" (UID: "6781ac88-7516-4101-8abd-9cacfbb930b7"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.865068 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6781ac88-7516-4101-8abd-9cacfbb930b7-kube-api-access-vmmmv" (OuterVolumeSpecName: "kube-api-access-vmmmv") pod "6781ac88-7516-4101-8abd-9cacfbb930b7" (UID: "6781ac88-7516-4101-8abd-9cacfbb930b7"). InnerVolumeSpecName "kube-api-access-vmmmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.867012 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-scripts" (OuterVolumeSpecName: "scripts") pod "6781ac88-7516-4101-8abd-9cacfbb930b7" (UID: "6781ac88-7516-4101-8abd-9cacfbb930b7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.890445 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6781ac88-7516-4101-8abd-9cacfbb930b7" (UID: "6781ac88-7516-4101-8abd-9cacfbb930b7"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.930845 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6781ac88-7516-4101-8abd-9cacfbb930b7" (UID: "6781ac88-7516-4101-8abd-9cacfbb930b7"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.964165 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.964389 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.964399 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.964409 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6781ac88-7516-4101-8abd-9cacfbb930b7-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.964420 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmmmv\" (UniqueName: \"kubernetes.io/projected/6781ac88-7516-4101-8abd-9cacfbb930b7-kube-api-access-vmmmv\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.964431 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:06 crc kubenswrapper[4691]: I0930 06:37:06.977646 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-config-data" (OuterVolumeSpecName: "config-data") pod "6781ac88-7516-4101-8abd-9cacfbb930b7" (UID: "6781ac88-7516-4101-8abd-9cacfbb930b7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.066454 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6781ac88-7516-4101-8abd-9cacfbb930b7-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.698313 4691 generic.go:334] "Generic (PLEG): container finished" podID="af3f1644-3ab8-4a6a-9f80-f8ea42297e98" containerID="0bda2d329b80bafe79c6a51e31c421bb8b3254080dc231fa2738d610c27ec014" exitCode=1 Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.698384 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"af3f1644-3ab8-4a6a-9f80-f8ea42297e98","Type":"ContainerDied","Data":"0bda2d329b80bafe79c6a51e31c421bb8b3254080dc231fa2738d610c27ec014"} Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.698689 4691 scope.go:117] "RemoveContainer" containerID="5501ce67b931bb5c3d6aeb1c094dd1c1c6449137f68ebefe633416b944309b7c" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.699595 4691 scope.go:117] "RemoveContainer" containerID="0bda2d329b80bafe79c6a51e31c421bb8b3254080dc231fa2738d610c27ec014" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.699599 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:07 crc kubenswrapper[4691]: E0930 06:37:07.699851 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(af3f1644-3ab8-4a6a-9f80-f8ea42297e98)\"" pod="openstack/watcher-decision-engine-0" podUID="af3f1644-3ab8-4a6a-9f80-f8ea42297e98" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.827059 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.844079 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.852619 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:07 crc kubenswrapper[4691]: E0930 06:37:07.853170 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="ceilometer-notification-agent" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.853199 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="ceilometer-notification-agent" Sep 30 06:37:07 crc kubenswrapper[4691]: E0930 06:37:07.853224 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="proxy-httpd" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.853232 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="proxy-httpd" Sep 30 06:37:07 crc kubenswrapper[4691]: E0930 06:37:07.853243 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="sg-core" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.853250 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="sg-core" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.853457 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="ceilometer-notification-agent" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.853478 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="sg-core" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.853514 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" containerName="proxy-httpd" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.855654 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.861270 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.861795 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.866020 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.981707 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-log-httpd\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.982041 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpc2f\" (UniqueName: \"kubernetes.io/projected/161707ca-6070-49ab-ab57-896ef94e4d83-kube-api-access-fpc2f\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.982166 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.982257 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-scripts\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.982340 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.982435 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-config-data\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:07 crc kubenswrapper[4691]: I0930 06:37:07.982569 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-run-httpd\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.084602 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-log-httpd\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.084673 4691 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpc2f\" (UniqueName: \"kubernetes.io/projected/161707ca-6070-49ab-ab57-896ef94e4d83-kube-api-access-fpc2f\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.084707 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.084971 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-scripts\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.085104 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.085170 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-log-httpd\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.085181 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-config-data\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.085443 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-run-httpd\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.085824 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-run-httpd\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.090325 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-config-data\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.090674 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.091026 4691 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-scripts\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.106825 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.113423 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpc2f\" (UniqueName: \"kubernetes.io/projected/161707ca-6070-49ab-ab57-896ef94e4d83-kube-api-access-fpc2f\") pod \"ceilometer-0\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") " pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.162664 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.163059 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="b2d59717-fe78-4912-ac77-bf28a8188b39" containerName="watcher-api-log" containerID="cri-o://2fbf394317facd465bebb8c9cf8b0e21fa97d8fc44d13fec042173d3642e11d3" gracePeriod=30 Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.163513 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/watcher-api-0" podUID="b2d59717-fe78-4912-ac77-bf28a8188b39" containerName="watcher-api" containerID="cri-o://1ff33ba74008cc7dd9c432f7c98e0e6bb7eda43d733a98199cabc425cf6ffdff" gracePeriod=30 Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.184322 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.692182 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.713397 4691 generic.go:334] "Generic (PLEG): container finished" podID="b2d59717-fe78-4912-ac77-bf28a8188b39" containerID="2fbf394317facd465bebb8c9cf8b0e21fa97d8fc44d13fec042173d3642e11d3" exitCode=143 Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.713570 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b2d59717-fe78-4912-ac77-bf28a8188b39","Type":"ContainerDied","Data":"2fbf394317facd465bebb8c9cf8b0e21fa97d8fc44d13fec042173d3642e11d3"} Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.716624 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"161707ca-6070-49ab-ab57-896ef94e4d83","Type":"ContainerStarted","Data":"4b9869bc4324750ceb1478c77b7c981d7157202fb6b6250c5d64997653999b42"} Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.837383 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-bf6754cd6-fsq4c" Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.983446 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Sep 30 06:37:08 crc kubenswrapper[4691]: I0930 06:37:08.989332 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:08.999345 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.003237 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.005303 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-kp7n4" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.011454 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.123198 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.123347 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.123397 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwbch\" (UniqueName: \"kubernetes.io/projected/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-kube-api-access-fwbch\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.123474 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-openstack-config\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.224947 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.225004 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwbch\" (UniqueName: \"kubernetes.io/projected/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-kube-api-access-fwbch\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.225065 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-openstack-config\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.225102 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.225859 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-openstack-config\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.229009 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-openstack-config-secret\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.231437 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-combined-ca-bundle\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.234928 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6781ac88-7516-4101-8abd-9cacfbb930b7" path="/var/lib/kubelet/pods/6781ac88-7516-4101-8abd-9cacfbb930b7/volumes" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.246144 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwbch\" (UniqueName: \"kubernetes.io/projected/7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee-kube-api-access-fwbch\") pod \"openstackclient\" (UID: \"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee\") " pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.314851 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.731843 4691 generic.go:334] "Generic (PLEG): container finished" podID="b2d59717-fe78-4912-ac77-bf28a8188b39" containerID="1ff33ba74008cc7dd9c432f7c98e0e6bb7eda43d733a98199cabc425cf6ffdff" exitCode=0 Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.732072 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b2d59717-fe78-4912-ac77-bf28a8188b39","Type":"ContainerDied","Data":"1ff33ba74008cc7dd9c432f7c98e0e6bb7eda43d733a98199cabc425cf6ffdff"} Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.739437 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"161707ca-6070-49ab-ab57-896ef94e4d83","Type":"ContainerStarted","Data":"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"} Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.739477 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"161707ca-6070-49ab-ab57-896ef94e4d83","Type":"ContainerStarted","Data":"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"} Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.859000 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Sep 30 06:37:09 crc kubenswrapper[4691]: I0930 06:37:09.943340 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.043766 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-combined-ca-bundle\") pod \"b2d59717-fe78-4912-ac77-bf28a8188b39\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.043859 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-custom-prometheus-ca\") pod \"b2d59717-fe78-4912-ac77-bf28a8188b39\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.043902 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-config-data\") pod \"b2d59717-fe78-4912-ac77-bf28a8188b39\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.043940 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt45q\" (UniqueName: \"kubernetes.io/projected/b2d59717-fe78-4912-ac77-bf28a8188b39-kube-api-access-qt45q\") pod \"b2d59717-fe78-4912-ac77-bf28a8188b39\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.044024 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2d59717-fe78-4912-ac77-bf28a8188b39-logs\") pod \"b2d59717-fe78-4912-ac77-bf28a8188b39\" (UID: \"b2d59717-fe78-4912-ac77-bf28a8188b39\") " Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.044788 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2d59717-fe78-4912-ac77-bf28a8188b39-logs" (OuterVolumeSpecName: "logs") pod "b2d59717-fe78-4912-ac77-bf28a8188b39" (UID: "b2d59717-fe78-4912-ac77-bf28a8188b39"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.064214 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2d59717-fe78-4912-ac77-bf28a8188b39-kube-api-access-qt45q" (OuterVolumeSpecName: "kube-api-access-qt45q") pod "b2d59717-fe78-4912-ac77-bf28a8188b39" (UID: "b2d59717-fe78-4912-ac77-bf28a8188b39"). InnerVolumeSpecName "kube-api-access-qt45q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.098086 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b2d59717-fe78-4912-ac77-bf28a8188b39" (UID: "b2d59717-fe78-4912-ac77-bf28a8188b39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.105698 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-custom-prometheus-ca" (OuterVolumeSpecName: "custom-prometheus-ca") pod "b2d59717-fe78-4912-ac77-bf28a8188b39" (UID: "b2d59717-fe78-4912-ac77-bf28a8188b39"). InnerVolumeSpecName "custom-prometheus-ca". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.146743 4691 reconciler_common.go:293] "Volume detached for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-custom-prometheus-ca\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.146771 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qt45q\" (UniqueName: \"kubernetes.io/projected/b2d59717-fe78-4912-ac77-bf28a8188b39-kube-api-access-qt45q\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.146780 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2d59717-fe78-4912-ac77-bf28a8188b39-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.146788 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.150656 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-config-data" (OuterVolumeSpecName: "config-data") pod "b2d59717-fe78-4912-ac77-bf28a8188b39" (UID: "b2d59717-fe78-4912-ac77-bf28a8188b39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.249102 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2d59717-fe78-4912-ac77-bf28a8188b39-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.759205 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"b2d59717-fe78-4912-ac77-bf28a8188b39","Type":"ContainerDied","Data":"d99e3f572ad75365f6ec7b9c45b342fcd41e9cdd9a46d16f7b7ae7a61d9124ea"} Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.759268 4691 scope.go:117] "RemoveContainer" containerID="1ff33ba74008cc7dd9c432f7c98e0e6bb7eda43d733a98199cabc425cf6ffdff" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.759402 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.763481 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"161707ca-6070-49ab-ab57-896ef94e4d83","Type":"ContainerStarted","Data":"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"} Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.765671 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee","Type":"ContainerStarted","Data":"337a08edc7b53f6cb49fb24b3f21b8647a2208a444b4dc3b6c699dcff8770889"} Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.792286 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-api-0"] Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.807446 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-api-0"] Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.822343 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/watcher-api-0"] Sep 30 06:37:10 crc kubenswrapper[4691]: E0930 06:37:10.822865 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2d59717-fe78-4912-ac77-bf28a8188b39" containerName="watcher-api" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.822962 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2d59717-fe78-4912-ac77-bf28a8188b39" containerName="watcher-api" Sep 30 06:37:10 crc kubenswrapper[4691]: E0930 06:37:10.823042 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2d59717-fe78-4912-ac77-bf28a8188b39" containerName="watcher-api-log" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.823106 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2d59717-fe78-4912-ac77-bf28a8188b39" containerName="watcher-api-log" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.823339 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2d59717-fe78-4912-ac77-bf28a8188b39" containerName="watcher-api" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.823429 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2d59717-fe78-4912-ac77-bf28a8188b39" containerName="watcher-api-log" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.824650 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/watcher-api-0" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.827217 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"watcher-api-config-data" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.827255 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-public-svc" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.827290 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-watcher-internal-svc" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.831644 4691 scope.go:117] "RemoveContainer" containerID="2fbf394317facd465bebb8c9cf8b0e21fa97d8fc44d13fec042173d3642e11d3" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.835163 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.962249 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-config-data\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.962527 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.962597 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.962646 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf8f9\" (UniqueName: \"kubernetes.io/projected/7c37a536-f38a-431d-8b76-fa23d610af0b-kube-api-access-nf8f9\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.962722 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c37a536-f38a-431d-8b76-fa23d610af0b-logs\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.962743 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:10 crc kubenswrapper[4691]: I0930 06:37:10.962764 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 
06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.064844 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c37a536-f38a-431d-8b76-fa23d610af0b-logs\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.064926 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.064949 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.065009 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-config-data\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.065034 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.065091 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.065602 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7c37a536-f38a-431d-8b76-fa23d610af0b-logs\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.065678 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf8f9\" (UniqueName: \"kubernetes.io/projected/7c37a536-f38a-431d-8b76-fa23d610af0b-kube-api-access-nf8f9\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.073746 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-combined-ca-bundle\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.075509 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-config-data\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 
06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.077604 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-internal-tls-certs\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.083566 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"custom-prometheus-ca\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-custom-prometheus-ca\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.088603 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf8f9\" (UniqueName: \"kubernetes.io/projected/7c37a536-f38a-431d-8b76-fa23d610af0b-kube-api-access-nf8f9\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.098479 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c37a536-f38a-431d-8b76-fa23d610af0b-public-tls-certs\") pod \"watcher-api-0\" (UID: \"7c37a536-f38a-431d-8b76-fa23d610af0b\") " pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.149867 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/watcher-api-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.245023 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2d59717-fe78-4912-ac77-bf28a8188b39" path="/var/lib/kubelet/pods/b2d59717-fe78-4912-ac77-bf28a8188b39/volumes" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.651411 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/watcher-api-0"] Sep 30 06:37:11 crc kubenswrapper[4691]: W0930 06:37:11.664417 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c37a536_f38a_431d_8b76_fa23d610af0b.slice/crio-bc5a1c71258f3aa019b4850fe6917744bc67ec939feff1fca6e6c53405ff44ad WatchSource:0}: Error finding container bc5a1c71258f3aa019b4850fe6917744bc67ec939feff1fca6e6c53405ff44ad: Status 404 returned error can't find the container with id bc5a1c71258f3aa019b4850fe6917744bc67ec939feff1fca6e6c53405ff44ad Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.779428 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"7c37a536-f38a-431d-8b76-fa23d610af0b","Type":"ContainerStarted","Data":"bc5a1c71258f3aa019b4850fe6917744bc67ec939feff1fca6e6c53405ff44ad"} Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.784230 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"161707ca-6070-49ab-ab57-896ef94e4d83","Type":"ContainerStarted","Data":"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"} Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.784360 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.785854 4691 generic.go:334] "Generic (PLEG): container finished" podID="071e402d-9775-412e-ad8a-1643cd646d7c" containerID="8b4784efaa23c98f9e640e2c991c0a4de71215216dde1b5054d199bea4aaf3fd" exitCode=0 Sep 
30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.785903 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fh78b" event={"ID":"071e402d-9775-412e-ad8a-1643cd646d7c","Type":"ContainerDied","Data":"8b4784efaa23c98f9e640e2c991c0a4de71215216dde1b5054d199bea4aaf3fd"} Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.807380 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.100257025 podStartE2EDuration="4.807365588s" podCreationTimestamp="2025-09-30 06:37:07 +0000 UTC" firstStartedPulling="2025-09-30 06:37:08.699507243 +0000 UTC m=+1072.174528283" lastFinishedPulling="2025-09-30 06:37:11.406615816 +0000 UTC m=+1074.881636846" observedRunningTime="2025-09-30 06:37:11.804127374 +0000 UTC m=+1075.279148424" watchObservedRunningTime="2025-09-30 06:37:11.807365588 +0000 UTC m=+1075.282386628" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.854036 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.940095 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7597958cd9-k94q9"] Sep 30 06:37:11 crc kubenswrapper[4691]: I0930 06:37:11.940710 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" podUID="04be0ebf-14ea-4b62-b235-af7e6fdff8ee" containerName="dnsmasq-dns" containerID="cri-o://1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54" gracePeriod=10 Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.630548 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.798480 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"7c37a536-f38a-431d-8b76-fa23d610af0b","Type":"ContainerStarted","Data":"08886d107a295ed26d325b5e4e74061d725f9a386ca06838f84be5ac4faeefca"} Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.798524 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-api-0" event={"ID":"7c37a536-f38a-431d-8b76-fa23d610af0b","Type":"ContainerStarted","Data":"783352430af85e7d18ff7c96b2f01a2bcd0035874ae1f9ae3b6c1f9f6913d98e"} Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.798541 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.802300 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-sb\") pod \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.802411 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9xfp\" (UniqueName: \"kubernetes.io/projected/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-kube-api-access-n9xfp\") pod \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.802470 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-svc\") pod 
\"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.802489 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-nb\") pod \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.802509 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-config\") pod \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.802523 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-swift-storage-0\") pod \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\" (UID: \"04be0ebf-14ea-4b62-b235-af7e6fdff8ee\") " Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.802993 4691 generic.go:334] "Generic (PLEG): container finished" podID="04be0ebf-14ea-4b62-b235-af7e6fdff8ee" containerID="1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54" exitCode=0 Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.803742 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.803910 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" event={"ID":"04be0ebf-14ea-4b62-b235-af7e6fdff8ee","Type":"ContainerDied","Data":"1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54"} Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.803938 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7597958cd9-k94q9" event={"ID":"04be0ebf-14ea-4b62-b235-af7e6fdff8ee","Type":"ContainerDied","Data":"002be7e11cef2be9f31196c501b5898291e0636d0ba563a6f6c02f623012a8b3"} Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.803954 4691 scope.go:117] "RemoveContainer" containerID="1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.814216 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-kube-api-access-n9xfp" (OuterVolumeSpecName: "kube-api-access-n9xfp") pod "04be0ebf-14ea-4b62-b235-af7e6fdff8ee" (UID: "04be0ebf-14ea-4b62-b235-af7e6fdff8ee"). InnerVolumeSpecName "kube-api-access-n9xfp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.865700 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-config" (OuterVolumeSpecName: "config") pod "04be0ebf-14ea-4b62-b235-af7e6fdff8ee" (UID: "04be0ebf-14ea-4b62-b235-af7e6fdff8ee"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.902459 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "04be0ebf-14ea-4b62-b235-af7e6fdff8ee" (UID: "04be0ebf-14ea-4b62-b235-af7e6fdff8ee"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.906322 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9xfp\" (UniqueName: \"kubernetes.io/projected/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-kube-api-access-n9xfp\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.906431 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.906493 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.913054 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "04be0ebf-14ea-4b62-b235-af7e6fdff8ee" (UID: "04be0ebf-14ea-4b62-b235-af7e6fdff8ee"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.933417 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "04be0ebf-14ea-4b62-b235-af7e6fdff8ee" (UID: "04be0ebf-14ea-4b62-b235-af7e6fdff8ee"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.935322 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "04be0ebf-14ea-4b62-b235-af7e6fdff8ee" (UID: "04be0ebf-14ea-4b62-b235-af7e6fdff8ee"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:12 crc kubenswrapper[4691]: I0930 06:37:12.962528 4691 scope.go:117] "RemoveContainer" containerID="8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.011596 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.011626 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.011635 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/04be0ebf-14ea-4b62-b235-af7e6fdff8ee-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.020942 4691 scope.go:117] "RemoveContainer" containerID="1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54" Sep 30 06:37:13 crc kubenswrapper[4691]: E0930 06:37:13.022104 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54\": container with ID starting with 1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54 not found: ID does not exist" containerID="1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.022156 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54"} err="failed to get container status \"1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54\": rpc error: code = NotFound desc = could not find container \"1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54\": container with ID starting with 1adf575f4136c8beacd48f416506a13b997d4fab22b6618e34606bc3c0c95a54 not found: ID does not exist" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.022190 4691 scope.go:117] "RemoveContainer" containerID="8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c" Sep 30 06:37:13 crc kubenswrapper[4691]: E0930 06:37:13.022483 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c\": container with ID starting with 8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c not found: ID does not exist" containerID="8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.022508 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c"} err="failed to get container status \"8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c\": rpc error: code = NotFound desc = could not find container \"8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c\": container with ID starting with 8e80ea8403d22c99056604869ce2d20a2c678097c8ae26225c3988c26c277c1c not found: ID does not exist" Sep 30 06:37:13 crc kubenswrapper[4691]: 
I0930 06:37:13.077787 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-67959b9db8-p8w6w" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api" probeResult="failure" output="HTTP probe failed with statuscode: 500" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.142371 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/watcher-api-0" podStartSLOduration=3.142354064 podStartE2EDuration="3.142354064s" podCreationTimestamp="2025-09-30 06:37:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:12.843274775 +0000 UTC m=+1076.318295835" watchObservedRunningTime="2025-09-30 06:37:13.142354064 +0000 UTC m=+1076.617375104" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.150386 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7597958cd9-k94q9"] Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.157616 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7597958cd9-k94q9"] Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.248758 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04be0ebf-14ea-4b62-b235-af7e6fdff8ee" path="/var/lib/kubelet/pods/04be0ebf-14ea-4b62-b235-af7e6fdff8ee/volumes" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.253209 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-fh78b" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.427787 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjg5r\" (UniqueName: \"kubernetes.io/projected/071e402d-9775-412e-ad8a-1643cd646d7c-kube-api-access-mjg5r\") pod \"071e402d-9775-412e-ad8a-1643cd646d7c\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.427829 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-config-data\") pod \"071e402d-9775-412e-ad8a-1643cd646d7c\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.427959 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-combined-ca-bundle\") pod \"071e402d-9775-412e-ad8a-1643cd646d7c\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.427988 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/071e402d-9775-412e-ad8a-1643cd646d7c-etc-machine-id\") pod \"071e402d-9775-412e-ad8a-1643cd646d7c\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.428021 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-scripts\") pod \"071e402d-9775-412e-ad8a-1643cd646d7c\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.428049 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-db-sync-config-data\") pod \"071e402d-9775-412e-ad8a-1643cd646d7c\" (UID: \"071e402d-9775-412e-ad8a-1643cd646d7c\") " Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.429183 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/071e402d-9775-412e-ad8a-1643cd646d7c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "071e402d-9775-412e-ad8a-1643cd646d7c" (UID: "071e402d-9775-412e-ad8a-1643cd646d7c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.434713 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/071e402d-9775-412e-ad8a-1643cd646d7c-kube-api-access-mjg5r" (OuterVolumeSpecName: "kube-api-access-mjg5r") pod "071e402d-9775-412e-ad8a-1643cd646d7c" (UID: "071e402d-9775-412e-ad8a-1643cd646d7c"). InnerVolumeSpecName "kube-api-access-mjg5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.436177 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "071e402d-9775-412e-ad8a-1643cd646d7c" (UID: "071e402d-9775-412e-ad8a-1643cd646d7c"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.444926 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-scripts" (OuterVolumeSpecName: "scripts") pod "071e402d-9775-412e-ad8a-1643cd646d7c" (UID: "071e402d-9775-412e-ad8a-1643cd646d7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.512996 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-config-data" (OuterVolumeSpecName: "config-data") pod "071e402d-9775-412e-ad8a-1643cd646d7c" (UID: "071e402d-9775-412e-ad8a-1643cd646d7c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.531055 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.531078 4691 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.531088 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjg5r\" (UniqueName: \"kubernetes.io/projected/071e402d-9775-412e-ad8a-1643cd646d7c-kube-api-access-mjg5r\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.531096 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.531106 4691 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/071e402d-9775-412e-ad8a-1643cd646d7c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.531208 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "071e402d-9775-412e-ad8a-1643cd646d7c" (UID: "071e402d-9775-412e-ad8a-1643cd646d7c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.632877 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/071e402d-9775-412e-ad8a-1643cd646d7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.702835 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.817621 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-fh78b" event={"ID":"071e402d-9775-412e-ad8a-1643cd646d7c","Type":"ContainerDied","Data":"3e47f682055924cdf667683eaf5a375316289e58027ae0b596c305f0f002919c"} Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.817654 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3e47f682055924cdf667683eaf5a375316289e58027ae0b596c305f0f002919c" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.817667 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-fh78b" Sep 30 06:37:13 crc kubenswrapper[4691]: I0930 06:37:13.887552 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.045278 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5755bf4df8-zx9td" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.159:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.159:8443: connect: connection refused" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.045368 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.143373 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-67cb6557b7-q5zk6"] Sep 30 06:37:14 crc kubenswrapper[4691]: E0930 06:37:14.143723 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="071e402d-9775-412e-ad8a-1643cd646d7c" containerName="cinder-db-sync" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.143734 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="071e402d-9775-412e-ad8a-1643cd646d7c" containerName="cinder-db-sync" Sep 30 06:37:14 crc kubenswrapper[4691]: E0930 06:37:14.143757 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04be0ebf-14ea-4b62-b235-af7e6fdff8ee" containerName="dnsmasq-dns" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.143763 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="04be0ebf-14ea-4b62-b235-af7e6fdff8ee" containerName="dnsmasq-dns" Sep 30 06:37:14 crc kubenswrapper[4691]: E0930 06:37:14.143782 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04be0ebf-14ea-4b62-b235-af7e6fdff8ee" containerName="init" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.143787 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="04be0ebf-14ea-4b62-b235-af7e6fdff8ee" containerName="init" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.154152 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="04be0ebf-14ea-4b62-b235-af7e6fdff8ee" containerName="dnsmasq-dns" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.154201 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="071e402d-9775-412e-ad8a-1643cd646d7c" containerName="cinder-db-sync" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.155345 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.173185 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67cb6557b7-q5zk6"] Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.191692 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.193242 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.204131 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-n5crj" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.204416 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.210629 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.211074 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.271641 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-svc\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.277204 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.277286 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.277325 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29wkn\" (UniqueName: \"kubernetes.io/projected/02b60d82-3894-4906-a792-84d9d0c2538e-kube-api-access-29wkn\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.277462 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.277991 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-config\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.307300 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.379635 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-config\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.379684 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.379736 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.380467 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-svc\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.380495 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.380594 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-config\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.381132 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-svc\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.381242 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.381327 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29wkn\" (UniqueName: \"kubernetes.io/projected/02b60d82-3894-4906-a792-84d9d0c2538e-kube-api-access-29wkn\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.381380 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data\") pod 
\"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.381363 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-swift-storage-0\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.381474 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhp8g\" (UniqueName: \"kubernetes.io/projected/0dd335f3-da02-4e62-a202-670de97399a9-kube-api-access-hhp8g\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.381610 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0dd335f3-da02-4e62-a202-670de97399a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.381690 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.381772 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.382104 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-nb\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.383935 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-sb\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.383974 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.385562 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.393914 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.406145 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.406184 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29wkn\" (UniqueName: \"kubernetes.io/projected/02b60d82-3894-4906-a792-84d9d0c2538e-kube-api-access-29wkn\") pod \"dnsmasq-dns-67cb6557b7-q5zk6\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.482788 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-scripts\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.482836 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.482863 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hhp8g\" (UniqueName: \"kubernetes.io/projected/0dd335f3-da02-4e62-a202-670de97399a9-kube-api-access-hhp8g\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.482905 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa07cbe-8687-4be2-8757-8e730fccb6bb-logs\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.482930 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0dd335f3-da02-4e62-a202-670de97399a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.482963 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.482978 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data-custom\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.483003 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.483020 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps4l8\" (UniqueName: \"kubernetes.io/projected/faa07cbe-8687-4be2-8757-8e730fccb6bb-kube-api-access-ps4l8\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.483040 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.483062 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.483082 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.483106 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/faa07cbe-8687-4be2-8757-8e730fccb6bb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.484935 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0dd335f3-da02-4e62-a202-670de97399a9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.492466 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-scripts\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.493088 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.494599 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.499501 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.504590 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhp8g\" (UniqueName: \"kubernetes.io/projected/0dd335f3-da02-4e62-a202-670de97399a9-kube-api-access-hhp8g\") pod \"cinder-scheduler-0\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.569500 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.585064 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-scripts\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.585132 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa07cbe-8687-4be2-8757-8e730fccb6bb-logs\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.585178 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data-custom\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.585208 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps4l8\" (UniqueName: \"kubernetes.io/projected/faa07cbe-8687-4be2-8757-8e730fccb6bb-kube-api-access-ps4l8\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.585237 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.585256 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.585282 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/faa07cbe-8687-4be2-8757-8e730fccb6bb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.585354 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/faa07cbe-8687-4be2-8757-8e730fccb6bb-etc-machine-id\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.586549 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa07cbe-8687-4be2-8757-8e730fccb6bb-logs\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.589220 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data-custom\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.589251 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-scripts\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.596310 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.598821 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.601588 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.608109 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps4l8\" (UniqueName: \"kubernetes.io/projected/faa07cbe-8687-4be2-8757-8e730fccb6bb-kube-api-access-ps4l8\") pod \"cinder-api-0\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " pod="openstack/cinder-api-0"
Sep 30 06:37:14 crc kubenswrapper[4691]: I0930 06:37:14.760702 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Sep 30 06:37:15 crc kubenswrapper[4691]: I0930 06:37:15.285514 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-67cb6557b7-q5zk6"]
Sep 30 06:37:15 crc kubenswrapper[4691]: I0930 06:37:15.339997 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Sep 30 06:37:15 crc kubenswrapper[4691]: I0930 06:37:15.340880 4691 scope.go:117] "RemoveContainer" containerID="0bda2d329b80bafe79c6a51e31c421bb8b3254080dc231fa2738d610c27ec014"
Sep 30 06:37:15 crc kubenswrapper[4691]: E0930 06:37:15.341156 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"watcher-decision-engine\" with CrashLoopBackOff: \"back-off 10s restarting failed container=watcher-decision-engine pod=watcher-decision-engine-0_openstack(af3f1644-3ab8-4a6a-9f80-f8ea42297e98)\"" pod="openstack/watcher-decision-engine-0" podUID="af3f1644-3ab8-4a6a-9f80-f8ea42297e98"
Sep 30 06:37:15 crc kubenswrapper[4691]: I0930 06:37:15.417345 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 06:37:15 crc kubenswrapper[4691]: I0930 06:37:15.729442 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Sep 30 06:37:15 crc kubenswrapper[4691]: I0930 06:37:15.872564 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0dd335f3-da02-4e62-a202-670de97399a9","Type":"ContainerStarted","Data":"393fd1f67fe2bf6db3d1de8d2a3ed849638384113c9fcf1f0911e5fa27f61d45"}
Sep 30 06:37:15 crc kubenswrapper[4691]: I0930 06:37:15.874630 4691 generic.go:334] "Generic (PLEG): container finished" podID="02b60d82-3894-4906-a792-84d9d0c2538e" containerID="f1a093915c38b87c7ffe83ea81e71d0931ff1af2141184b2d2980c0c107fff2b" exitCode=0
Sep 30 06:37:15 crc kubenswrapper[4691]: I0930 06:37:15.874670 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" event={"ID":"02b60d82-3894-4906-a792-84d9d0c2538e","Type":"ContainerDied","Data":"f1a093915c38b87c7ffe83ea81e71d0931ff1af2141184b2d2980c0c107fff2b"}
Sep 30 06:37:15 crc kubenswrapper[4691]: I0930 06:37:15.874686 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" event={"ID":"02b60d82-3894-4906-a792-84d9d0c2538e","Type":"ContainerStarted","Data":"335cbf83626b016c722d2696db1e91dbd98294200053b48ef1c025713d009cff"}
Sep 30 06:37:15 crc kubenswrapper[4691]: I0930 06:37:15.896776 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"faa07cbe-8687-4be2-8757-8e730fccb6bb","Type":"ContainerStarted","Data":"aa55314589fad3b35775c0a75df9ffc5ddae38488d7f3783e12f6abd00b93fbd"}
Sep 30 06:37:16 crc kubenswrapper[4691]: I0930 06:37:16.151437 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-api-0"
Sep 30 06:37:16 crc kubenswrapper[4691]: I0930 06:37:16.151850 4691 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Sep 30 06:37:16 crc kubenswrapper[4691]: I0930 06:37:16.984138 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" event={"ID":"02b60d82-3894-4906-a792-84d9d0c2538e","Type":"ContainerStarted","Data":"99ccc05210f364478faded7979c78de08cfdbabef8111547b044e25c689345fe"}
Sep 30 06:37:16 crc kubenswrapper[4691]: I0930 06:37:16.985335 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6"
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.015430 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"faa07cbe-8687-4be2-8757-8e730fccb6bb","Type":"ContainerStarted","Data":"6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0"}
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.035468 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0dd335f3-da02-4e62-a202-670de97399a9","Type":"ContainerStarted","Data":"6c26171ffd1df45ef63125ed39237d7064089677283a5d6999e84c70da922bd8"}
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.049419 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" podStartSLOduration=3.04940449 podStartE2EDuration="3.04940449s" podCreationTimestamp="2025-09-30 06:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:17.015720319 +0000 UTC m=+1080.490741369" watchObservedRunningTime="2025-09-30 06:37:17.04940449 +0000 UTC m=+1080.524425530"
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.241229 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0"
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.543666 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.585241 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68888bb5f6-d225g"
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.884910 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-688b4ff469-2cgjc"]
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.886737 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.895946 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.896069 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.896172 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Sep 30 06:37:17 crc kubenswrapper[4691]: I0930 06:37:17.910993 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-688b4ff469-2cgjc"]
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.016690 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-68888bb5f6-d225g"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.023493 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4e270ac-98e8-47b9-bf7b-7492996aa18c-etc-swift\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.023564 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-internal-tls-certs\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.023618 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e270ac-98e8-47b9-bf7b-7492996aa18c-run-httpd\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.023648 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-combined-ca-bundle\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.023717 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th55q\" (UniqueName: \"kubernetes.io/projected/e4e270ac-98e8-47b9-bf7b-7492996aa18c-kube-api-access-th55q\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.023751 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-config-data\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.023764 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e270ac-98e8-47b9-bf7b-7492996aa18c-log-httpd\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.023786 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-public-tls-certs\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.088235 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-67959b9db8-p8w6w"]
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.096723 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="faa07cbe-8687-4be2-8757-8e730fccb6bb" containerName="cinder-api-log" containerID="cri-o://6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0" gracePeriod=30
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.097151 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="faa07cbe-8687-4be2-8757-8e730fccb6bb" containerName="cinder-api" containerID="cri-o://4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587" gracePeriod=30
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.097596 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.097732 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0dd335f3-da02-4e62-a202-670de97399a9","Type":"ContainerStarted","Data":"d4acda655110d6831a69dda83e4b9168c76994949041e5c7a6a52d49c8340068"}
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.097831 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"faa07cbe-8687-4be2-8757-8e730fccb6bb","Type":"ContainerStarted","Data":"4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587"}
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.098117 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-67959b9db8-p8w6w" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api-log" containerID="cri-o://ef517aaa6758085874127d7c6326aed742a754e9537fde8af3ab95d6767d0e04" gracePeriod=30
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.098289 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-67959b9db8-p8w6w" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api" containerID="cri-o://b6752c80f69833593d27c5ffb66ab3315ee613a1f95e62af78e01a7a5d3dd421" gracePeriod=30
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.119000 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-67959b9db8-p8w6w" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": EOF"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.119145 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-67959b9db8-p8w6w" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": EOF"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.121169 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/barbican-api-67959b9db8-p8w6w" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": EOF"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.125259 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th55q\" (UniqueName: \"kubernetes.io/projected/e4e270ac-98e8-47b9-bf7b-7492996aa18c-kube-api-access-th55q\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.125315 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-config-data\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.125333 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e270ac-98e8-47b9-bf7b-7492996aa18c-log-httpd\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.125360 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-public-tls-certs\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.125386 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4e270ac-98e8-47b9-bf7b-7492996aa18c-etc-swift\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.125431 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-internal-tls-certs\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.125484 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e270ac-98e8-47b9-bf7b-7492996aa18c-run-httpd\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.125517 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-combined-ca-bundle\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.126972 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.774496261 podStartE2EDuration="4.126955213s" podCreationTimestamp="2025-09-30 06:37:14 +0000 UTC" firstStartedPulling="2025-09-30 06:37:15.434005604 +0000 UTC m=+1078.909026644" lastFinishedPulling="2025-09-30 06:37:15.786464556 +0000 UTC m=+1079.261485596" observedRunningTime="2025-09-30 06:37:18.121387355 +0000 UTC m=+1081.596408395" watchObservedRunningTime="2025-09-30 06:37:18.126955213 +0000 UTC m=+1081.601976253"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.128724 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e270ac-98e8-47b9-bf7b-7492996aa18c-log-httpd\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.130021 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e4e270ac-98e8-47b9-bf7b-7492996aa18c-run-httpd\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.137217 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/e4e270ac-98e8-47b9-bf7b-7492996aa18c-etc-swift\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.144138 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-public-tls-certs\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.148683 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-internal-tls-certs\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.159216 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-config-data\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.166049 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4e270ac-98e8-47b9-bf7b-7492996aa18c-combined-ca-bundle\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.166146 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=4.16612764 podStartE2EDuration="4.16612764s" podCreationTimestamp="2025-09-30 06:37:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:18.147882204 +0000 UTC m=+1081.622903244" watchObservedRunningTime="2025-09-30 06:37:18.16612764 +0000 UTC m=+1081.641148680"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.175587 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th55q\" (UniqueName: \"kubernetes.io/projected/e4e270ac-98e8-47b9-bf7b-7492996aa18c-kube-api-access-th55q\") pod \"swift-proxy-688b4ff469-2cgjc\" (UID: \"e4e270ac-98e8-47b9-bf7b-7492996aa18c\") " pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:18 crc kubenswrapper[4691]: I0930 06:37:18.227189 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.105593 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.106233 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="ceilometer-central-agent" containerID="cri-o://d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090" gracePeriod=30
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.106606 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="proxy-httpd" containerID="cri-o://1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138" gracePeriod=30
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.106654 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="sg-core" containerID="cri-o://60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591" gracePeriod=30
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.106684 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="ceilometer-notification-agent" containerID="cri-o://14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7" gracePeriod=30
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.126572 4691 generic.go:334] "Generic (PLEG): container finished" podID="faa07cbe-8687-4be2-8757-8e730fccb6bb" containerID="6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0" exitCode=143
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.126692 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"faa07cbe-8687-4be2-8757-8e730fccb6bb","Type":"ContainerDied","Data":"6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0"}
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.139318 4691 generic.go:334] "Generic (PLEG): container finished" podID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerID="ef517aaa6758085874127d7c6326aed742a754e9537fde8af3ab95d6767d0e04" exitCode=143
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.139398 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67959b9db8-p8w6w" event={"ID":"cbfa4b8e-37ac-4713-a8af-b610f086f7e2","Type":"ContainerDied","Data":"ef517aaa6758085874127d7c6326aed742a754e9537fde8af3ab95d6767d0e04"}
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.169055 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-688b4ff469-2cgjc"]
Sep 30 06:37:19 crc kubenswrapper[4691]: I0930 06:37:19.597196 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.155984 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-688b4ff469-2cgjc" event={"ID":"e4e270ac-98e8-47b9-bf7b-7492996aa18c","Type":"ContainerStarted","Data":"b48003aeec000b63e708231241af6220cd3011ca4fa635753adb44514d3116ed"}
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.156251 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-688b4ff469-2cgjc" event={"ID":"e4e270ac-98e8-47b9-bf7b-7492996aa18c","Type":"ContainerStarted","Data":"a22e33b26f49821d7b4f72640a8970d9cc615fb2150001133d25d0114b20b322"}
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.156262 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-688b4ff469-2cgjc" event={"ID":"e4e270ac-98e8-47b9-bf7b-7492996aa18c","Type":"ContainerStarted","Data":"9b5d963805a17af15681591ddac7ade8030963073c706ae70c6650748d100664"}
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.157606 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.157640 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-688b4ff469-2cgjc"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.158400 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.173630 4691 generic.go:334] "Generic (PLEG): container finished" podID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerID="b433c5a51bea7ed49e49b10d76eaae08ac7b93bc283b6448ac328bf46caa578c" exitCode=137
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.173689 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5755bf4df8-zx9td" event={"ID":"cc29be35-3ceb-4a88-af6e-77e2d0cbab83","Type":"ContainerDied","Data":"b433c5a51bea7ed49e49b10d76eaae08ac7b93bc283b6448ac328bf46caa578c"}
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.180720 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-688b4ff469-2cgjc" podStartSLOduration=3.180708297 podStartE2EDuration="3.180708297s" podCreationTimestamp="2025-09-30 06:37:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:20.178154435 +0000 UTC m=+1083.653175485" watchObservedRunningTime="2025-09-30 06:37:20.180708297 +0000 UTC m=+1083.655729337"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.196301 4691 generic.go:334] "Generic (PLEG): container finished" podID="161707ca-6070-49ab-ab57-896ef94e4d83" containerID="1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138" exitCode=0
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.196330 4691 generic.go:334] "Generic (PLEG): container finished" podID="161707ca-6070-49ab-ab57-896ef94e4d83" containerID="60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591" exitCode=2
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.196338 4691 generic.go:334] "Generic (PLEG): container finished" podID="161707ca-6070-49ab-ab57-896ef94e4d83" containerID="14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7" exitCode=0
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.196344 4691 generic.go:334] "Generic (PLEG): container finished" podID="161707ca-6070-49ab-ab57-896ef94e4d83" containerID="d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090" exitCode=0
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.197205 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.197404 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"161707ca-6070-49ab-ab57-896ef94e4d83","Type":"ContainerDied","Data":"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"}
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.197431 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"161707ca-6070-49ab-ab57-896ef94e4d83","Type":"ContainerDied","Data":"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"}
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.197442 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"161707ca-6070-49ab-ab57-896ef94e4d83","Type":"ContainerDied","Data":"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"}
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.197453 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"161707ca-6070-49ab-ab57-896ef94e4d83","Type":"ContainerDied","Data":"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"}
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.197461 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"161707ca-6070-49ab-ab57-896ef94e4d83","Type":"ContainerDied","Data":"4b9869bc4324750ceb1478c77b7c981d7157202fb6b6250c5d64997653999b42"}
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.197474 4691 scope.go:117] "RemoveContainer" containerID="1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.275402 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-log-httpd\") pod \"161707ca-6070-49ab-ab57-896ef94e4d83\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.275449 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpc2f\" (UniqueName: \"kubernetes.io/projected/161707ca-6070-49ab-ab57-896ef94e4d83-kube-api-access-fpc2f\") pod \"161707ca-6070-49ab-ab57-896ef94e4d83\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.275500 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-run-httpd\") pod \"161707ca-6070-49ab-ab57-896ef94e4d83\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.275661 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-scripts\") pod \"161707ca-6070-49ab-ab57-896ef94e4d83\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.275721 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-sg-core-conf-yaml\") pod \"161707ca-6070-49ab-ab57-896ef94e4d83\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.275746 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-combined-ca-bundle\") pod \"161707ca-6070-49ab-ab57-896ef94e4d83\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.275822 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-config-data\") pod \"161707ca-6070-49ab-ab57-896ef94e4d83\" (UID: \"161707ca-6070-49ab-ab57-896ef94e4d83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.277614 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "161707ca-6070-49ab-ab57-896ef94e4d83" (UID: "161707ca-6070-49ab-ab57-896ef94e4d83"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.279912 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "161707ca-6070-49ab-ab57-896ef94e4d83" (UID: "161707ca-6070-49ab-ab57-896ef94e4d83"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.289394 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-scripts" (OuterVolumeSpecName: "scripts") pod "161707ca-6070-49ab-ab57-896ef94e4d83" (UID: "161707ca-6070-49ab-ab57-896ef94e4d83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.289544 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/161707ca-6070-49ab-ab57-896ef94e4d83-kube-api-access-fpc2f" (OuterVolumeSpecName: "kube-api-access-fpc2f") pod "161707ca-6070-49ab-ab57-896ef94e4d83" (UID: "161707ca-6070-49ab-ab57-896ef94e4d83"). InnerVolumeSpecName "kube-api-access-fpc2f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.318045 4691 scope.go:117] "RemoveContainer" containerID="60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.361470 4691 scope.go:117] "RemoveContainer" containerID="14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.374120 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "161707ca-6070-49ab-ab57-896ef94e4d83" (UID: "161707ca-6070-49ab-ab57-896ef94e4d83"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.377940 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.377965 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpc2f\" (UniqueName: \"kubernetes.io/projected/161707ca-6070-49ab-ab57-896ef94e4d83-kube-api-access-fpc2f\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.377974 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/161707ca-6070-49ab-ab57-896ef94e4d83-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.377982 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.377991 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.378816 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5755bf4df8-zx9td"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.392084 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-config-data" (OuterVolumeSpecName: "config-data") pod "161707ca-6070-49ab-ab57-896ef94e4d83" (UID: "161707ca-6070-49ab-ab57-896ef94e4d83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.401829 4691 scope.go:117] "RemoveContainer" containerID="d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.432851 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "161707ca-6070-49ab-ab57-896ef94e4d83" (UID: "161707ca-6070-49ab-ab57-896ef94e4d83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.438400 4691 scope.go:117] "RemoveContainer" containerID="1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"
Sep 30 06:37:20 crc kubenswrapper[4691]: E0930 06:37:20.438801 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138\": container with ID starting with 1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138 not found: ID does not exist" containerID="1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.438827 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"} err="failed to get container status \"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138\": rpc error: code = NotFound desc = could not find container \"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138\": container with ID starting with 1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.438846 4691 scope.go:117] "RemoveContainer" containerID="60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"
Sep 30 06:37:20 crc kubenswrapper[4691]: E0930 06:37:20.439270 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591\": container with ID starting with 60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591 not found: ID does not exist" containerID="60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.439291 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"} err="failed to get container status \"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591\": rpc error: code = NotFound desc = could not find container \"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591\": container with ID starting with 60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.439302 4691 scope.go:117] "RemoveContainer" containerID="14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"
Sep 30 06:37:20 crc kubenswrapper[4691]: E0930 06:37:20.439545 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7\": container with ID starting with 14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7 not found: ID does not exist" containerID="14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.439566 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"} err="failed to get container status \"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7\": rpc error: code = NotFound desc = could not find container \"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7\": container with ID starting with 14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.439578 4691 scope.go:117] "RemoveContainer" containerID="d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"
Sep 30 06:37:20 crc kubenswrapper[4691]: E0930 06:37:20.439788 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090\": container with ID starting with d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090 not found: ID does not exist" containerID="d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.439806 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"} err="failed to get container status \"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090\": rpc error: code = NotFound desc = could not find container \"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090\": container with ID starting with d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.439817 4691 scope.go:117] "RemoveContainer" containerID="1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.440033 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"} err="failed to get container status \"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138\": rpc error: code = NotFound desc = could not find container \"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138\": container with ID starting with 1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.440048 4691 scope.go:117] "RemoveContainer" containerID="60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.440248 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"} err="failed to get container status \"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591\": rpc error: code = NotFound desc = could not find container \"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591\": container with ID starting with 60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.440263 4691 scope.go:117] "RemoveContainer" containerID="14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.440441 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"} err="failed to get container status \"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7\": rpc error: code = NotFound desc = could not find container \"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7\": container with ID starting with 14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.440456 4691 scope.go:117] "RemoveContainer" containerID="d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.440624 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"} err="failed to get container status \"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090\": rpc error: code = NotFound desc = could not find container \"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090\": container with ID starting with d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.440685 4691 scope.go:117] "RemoveContainer" containerID="1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.441095 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"} err="failed to get container status \"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138\": rpc error: code = NotFound desc = could not find container \"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138\": container with ID starting with 1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.441164 4691 scope.go:117] "RemoveContainer" containerID="60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.441637 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"} err="failed to get container status \"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591\": rpc error: code = NotFound desc = could not find container \"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591\": container with ID starting with 60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.441703 4691 scope.go:117] "RemoveContainer" containerID="14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.441985 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"} err="failed to get container status \"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7\": rpc error: code = NotFound desc = could not find container \"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7\": container with ID starting with 14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.442065 4691 scope.go:117] "RemoveContainer" containerID="d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.442443 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"} err="failed to get container status \"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090\": rpc error: code = NotFound desc = could not find container \"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090\": container with ID starting with d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.442520 4691 scope.go:117] "RemoveContainer" containerID="1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.442778 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138"} err="failed to get container status \"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138\": rpc error: code = NotFound desc = could not find container \"1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138\": container with ID starting with 1ad5b67aebaa793bf0bcd7f48d2e870ccc1e842d3ac14779b24c3da1ab463138 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.442850 4691 scope.go:117] "RemoveContainer" containerID="60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.443142 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591"} err="failed to get container status \"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591\": rpc error: code = NotFound desc = could not find container \"60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591\": container with ID starting with 60b9438b6446b0d09bd8a0c5790c7ecedc34cda8676c05ebfe940b14407da591 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.443275 4691 scope.go:117] "RemoveContainer" containerID="14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.443548 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7"} err="failed to get container status \"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7\": rpc error: code = NotFound desc = could not find container \"14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7\": container with ID starting with 14c17752051a8cbb66057b5df10ffc1ffb6f8a37529dde585276ce97faaea3b7 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.443658 4691 scope.go:117] "RemoveContainer" containerID="d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.444037 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090"} err="failed to get container status \"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090\": rpc error: code = NotFound desc = could not find container \"d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090\": container with ID starting with d62e58cae8a8bd7109f62fddbca91c1ee5314dab43689f8bc58f3f2e599d9090 not found: ID does not exist"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.479154 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-secret-key\") pod \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.479206 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-config-data\") pod \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.479278 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-tls-certs\") pod \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.479355 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-logs\") pod \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.479451 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-combined-ca-bundle\") pod \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.479488 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlmvf\" (UniqueName: \"kubernetes.io/projected/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-kube-api-access-rlmvf\") pod \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.479508 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-scripts\") pod \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\" (UID: \"cc29be35-3ceb-4a88-af6e-77e2d0cbab83\") "
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.479879 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.479901 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/161707ca-6070-49ab-ab57-896ef94e4d83-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.483140 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-logs" (OuterVolumeSpecName: "logs") pod "cc29be35-3ceb-4a88-af6e-77e2d0cbab83" (UID: "cc29be35-3ceb-4a88-af6e-77e2d0cbab83"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.484341 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "cc29be35-3ceb-4a88-af6e-77e2d0cbab83" (UID: "cc29be35-3ceb-4a88-af6e-77e2d0cbab83"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.492050 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-kube-api-access-rlmvf" (OuterVolumeSpecName: "kube-api-access-rlmvf") pod "cc29be35-3ceb-4a88-af6e-77e2d0cbab83" (UID: "cc29be35-3ceb-4a88-af6e-77e2d0cbab83"). InnerVolumeSpecName "kube-api-access-rlmvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.529156 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc29be35-3ceb-4a88-af6e-77e2d0cbab83" (UID: "cc29be35-3ceb-4a88-af6e-77e2d0cbab83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.541992 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.555954 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.556418 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-config-data" (OuterVolumeSpecName: "config-data") pod "cc29be35-3ceb-4a88-af6e-77e2d0cbab83" (UID: "cc29be35-3ceb-4a88-af6e-77e2d0cbab83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.556832 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-scripts" (OuterVolumeSpecName: "scripts") pod "cc29be35-3ceb-4a88-af6e-77e2d0cbab83" (UID: "cc29be35-3ceb-4a88-af6e-77e2d0cbab83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.569384 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:37:20 crc kubenswrapper[4691]: E0930 06:37:20.569948 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="proxy-httpd"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.570010 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="proxy-httpd"
Sep 30 06:37:20 crc kubenswrapper[4691]: E0930 06:37:20.570061 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.570110 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon"
Sep 30 06:37:20 crc kubenswrapper[4691]: E0930 06:37:20.570210 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="ceilometer-central-agent"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.570264 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="ceilometer-central-agent"
Sep 30 06:37:20 crc kubenswrapper[4691]: E0930 06:37:20.570418 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="ceilometer-notification-agent"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.570495 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="ceilometer-notification-agent"
Sep 30 06:37:20 crc kubenswrapper[4691]: E0930 06:37:20.570569 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="sg-core"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.570642 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="sg-core"
Sep 30 06:37:20 crc kubenswrapper[4691]: E0930 06:37:20.570721 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon-log"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.570791 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon-log"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.571119 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="ceilometer-notification-agent"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.571225 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.571590 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" containerName="horizon-log"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.571649 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="sg-core"
Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.571733 4691 memory_manager.go:354] "RemoveStaleState removing state"
podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="proxy-httpd" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.571794 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" containerName="ceilometer-central-agent" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.581586 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.583567 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584102 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-scripts\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584136 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584191 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-config-data\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584344 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-run-httpd\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584369 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-log-httpd\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584427 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jswsb\" (UniqueName: \"kubernetes.io/projected/3be92833-059a-4083-9889-e552dc6eda8d-kube-api-access-jswsb\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584584 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584596 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlmvf\" (UniqueName: 
\"kubernetes.io/projected/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-kube-api-access-rlmvf\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584608 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584617 4691 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584645 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.584654 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.585038 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.587789 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.589397 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.590063 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "cc29be35-3ceb-4a88-af6e-77e2d0cbab83" (UID: "cc29be35-3ceb-4a88-af6e-77e2d0cbab83"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.685130 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jswsb\" (UniqueName: \"kubernetes.io/projected/3be92833-059a-4083-9889-e552dc6eda8d-kube-api-access-jswsb\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.685506 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.685615 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-scripts\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.685765 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.685874 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-config-data\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.685997 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-run-httpd\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.686104 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-log-httpd\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.686228 4691 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cc29be35-3ceb-4a88-af6e-77e2d0cbab83-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.686782 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-run-httpd\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.687091 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-log-httpd\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.689686 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-scripts\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.690421 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-config-data\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.690950 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.697793 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.699937 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jswsb\" (UniqueName: \"kubernetes.io/projected/3be92833-059a-4083-9889-e552dc6eda8d-kube-api-access-jswsb\") pod \"ceilometer-0\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") " pod="openstack/ceilometer-0" Sep 30 06:37:20 crc kubenswrapper[4691]: I0930 06:37:20.953408 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.035426 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-67959b9db8-p8w6w" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": read tcp 10.217.0.2:39116->10.217.0.178:9311: read: connection reset by peer" Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.151722 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-api-0" Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.187006 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-api-0" Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.246845 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="161707ca-6070-49ab-ab57-896ef94e4d83" path="/var/lib/kubelet/pods/161707ca-6070-49ab-ab57-896ef94e4d83/volumes" Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.251107 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5755bf4df8-zx9td" event={"ID":"cc29be35-3ceb-4a88-af6e-77e2d0cbab83","Type":"ContainerDied","Data":"7989021c2b637df47a2ef8b27fea833ce3c2ef304e7224dad15d0d8faa5ab7fb"} Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.251159 4691 scope.go:117] "RemoveContainer" containerID="6cf0134f04e4636a4eb6deba05544969528280e6e7e910d2525ad3eab4aef3df" Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.251291 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5755bf4df8-zx9td" Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.273133 4691 generic.go:334] "Generic (PLEG): container finished" podID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerID="b6752c80f69833593d27c5ffb66ab3315ee613a1f95e62af78e01a7a5d3dd421" exitCode=0 Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.273503 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67959b9db8-p8w6w" event={"ID":"cbfa4b8e-37ac-4713-a8af-b610f086f7e2","Type":"ContainerDied","Data":"b6752c80f69833593d27c5ffb66ab3315ee613a1f95e62af78e01a7a5d3dd421"} Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.291057 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5755bf4df8-zx9td"] Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.295194 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-api-0" Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.298609 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5755bf4df8-zx9td"] Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.476191 4691 scope.go:117] "RemoveContainer" containerID="b433c5a51bea7ed49e49b10d76eaae08ac7b93bc283b6448ac328bf46caa578c" Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.606495 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:21 crc kubenswrapper[4691]: I0930 06:37:21.973755 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.116683 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-combined-ca-bundle\") pod \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.116802 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddktz\" (UniqueName: \"kubernetes.io/projected/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-kube-api-access-ddktz\") pod \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.116830 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data\") pod \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.116867 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data-custom\") pod \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.117000 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-logs\") pod \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\" (UID: \"cbfa4b8e-37ac-4713-a8af-b610f086f7e2\") " Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.117998 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-logs" (OuterVolumeSpecName: "logs") pod "cbfa4b8e-37ac-4713-a8af-b610f086f7e2" (UID: "cbfa4b8e-37ac-4713-a8af-b610f086f7e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.121203 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "cbfa4b8e-37ac-4713-a8af-b610f086f7e2" (UID: "cbfa4b8e-37ac-4713-a8af-b610f086f7e2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.121654 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-kube-api-access-ddktz" (OuterVolumeSpecName: "kube-api-access-ddktz") pod "cbfa4b8e-37ac-4713-a8af-b610f086f7e2" (UID: "cbfa4b8e-37ac-4713-a8af-b610f086f7e2"). InnerVolumeSpecName "kube-api-access-ddktz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.156226 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cbfa4b8e-37ac-4713-a8af-b610f086f7e2" (UID: "cbfa4b8e-37ac-4713-a8af-b610f086f7e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.164090 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data" (OuterVolumeSpecName: "config-data") pod "cbfa4b8e-37ac-4713-a8af-b610f086f7e2" (UID: "cbfa4b8e-37ac-4713-a8af-b610f086f7e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.219872 4691 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.219936 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.219946 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.219958 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddktz\" (UniqueName: \"kubernetes.io/projected/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-kube-api-access-ddktz\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.219967 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cbfa4b8e-37ac-4713-a8af-b610f086f7e2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.283800 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-67959b9db8-p8w6w" event={"ID":"cbfa4b8e-37ac-4713-a8af-b610f086f7e2","Type":"ContainerDied","Data":"feb2146197bcd200b2070e9ef6b33126ce9c44667ce4e55e7a22dde92f49def9"} Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.283847 4691 scope.go:117] "RemoveContainer" containerID="b6752c80f69833593d27c5ffb66ab3315ee613a1f95e62af78e01a7a5d3dd421" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.283959 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-67959b9db8-p8w6w" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.287063 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3be92833-059a-4083-9889-e552dc6eda8d","Type":"ContainerStarted","Data":"f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f"} Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.287143 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3be92833-059a-4083-9889-e552dc6eda8d","Type":"ContainerStarted","Data":"d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2"} Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.287155 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3be92833-059a-4083-9889-e552dc6eda8d","Type":"ContainerStarted","Data":"eccea64742222760309b7d7d6b9b77fef5f4330f16b3aa773753e093b9691580"} Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.316570 4691 scope.go:117] "RemoveContainer" containerID="ef517aaa6758085874127d7c6326aed742a754e9537fde8af3ab95d6767d0e04" Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.322680 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-67959b9db8-p8w6w"] Sep 30 06:37:22 crc kubenswrapper[4691]: I0930 06:37:22.329946 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-67959b9db8-p8w6w"] Sep 30 06:37:23 crc kubenswrapper[4691]: I0930 06:37:23.241370 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" path="/var/lib/kubelet/pods/cbfa4b8e-37ac-4713-a8af-b610f086f7e2/volumes" Sep 30 06:37:23 crc kubenswrapper[4691]: I0930 06:37:23.242257 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc29be35-3ceb-4a88-af6e-77e2d0cbab83" path="/var/lib/kubelet/pods/cc29be35-3ceb-4a88-af6e-77e2d0cbab83/volumes" Sep 30 06:37:23 crc kubenswrapper[4691]: I0930 06:37:23.843009 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:24 crc kubenswrapper[4691]: I0930 06:37:24.571359 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:37:24 crc kubenswrapper[4691]: I0930 06:37:24.630010 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5865f587f5-rvd99"] Sep 30 06:37:24 crc kubenswrapper[4691]: I0930 06:37:24.630273 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" podUID="ab991360-2557-48e2-b39e-91dece03bcbe" containerName="dnsmasq-dns" containerID="cri-o://8c0cf071d81fdd3a88c2d92130296e4f87e7baf332fbdd0f9820c61cae827b36" gracePeriod=10 Sep 30 06:37:24 crc kubenswrapper[4691]: I0930 06:37:24.676505 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-6f47894d84-xb69p" Sep 30 06:37:24 crc kubenswrapper[4691]: I0930 06:37:24.885377 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 06:37:24 crc kubenswrapper[4691]: I0930 06:37:24.940068 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Sep 30 06:37:25 crc kubenswrapper[4691]: I0930 06:37:25.324788 4691 generic.go:334] "Generic (PLEG): container finished" podID="ab991360-2557-48e2-b39e-91dece03bcbe" 
containerID="8c0cf071d81fdd3a88c2d92130296e4f87e7baf332fbdd0f9820c61cae827b36" exitCode=0 Sep 30 06:37:25 crc kubenswrapper[4691]: I0930 06:37:25.324908 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" event={"ID":"ab991360-2557-48e2-b39e-91dece03bcbe","Type":"ContainerDied","Data":"8c0cf071d81fdd3a88c2d92130296e4f87e7baf332fbdd0f9820c61cae827b36"} Sep 30 06:37:25 crc kubenswrapper[4691]: I0930 06:37:25.325004 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0dd335f3-da02-4e62-a202-670de97399a9" containerName="cinder-scheduler" containerID="cri-o://6c26171ffd1df45ef63125ed39237d7064089677283a5d6999e84c70da922bd8" gracePeriod=30 Sep 30 06:37:25 crc kubenswrapper[4691]: I0930 06:37:25.325081 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="0dd335f3-da02-4e62-a202-670de97399a9" containerName="probe" containerID="cri-o://d4acda655110d6831a69dda83e4b9168c76994949041e5c7a6a52d49c8340068" gracePeriod=30 Sep 30 06:37:26 crc kubenswrapper[4691]: I0930 06:37:26.338915 4691 generic.go:334] "Generic (PLEG): container finished" podID="0dd335f3-da02-4e62-a202-670de97399a9" containerID="d4acda655110d6831a69dda83e4b9168c76994949041e5c7a6a52d49c8340068" exitCode=0 Sep 30 06:37:26 crc kubenswrapper[4691]: I0930 06:37:26.339015 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0dd335f3-da02-4e62-a202-670de97399a9","Type":"ContainerDied","Data":"d4acda655110d6831a69dda83e4b9168c76994949041e5c7a6a52d49c8340068"} Sep 30 06:37:26 crc kubenswrapper[4691]: I0930 06:37:26.850247 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" podUID="ab991360-2557-48e2-b39e-91dece03bcbe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.177:5353: connect: connection refused" Sep 30 06:37:26 crc kubenswrapper[4691]: I0930 06:37:26.871302 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-67959b9db8-p8w6w" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 06:37:26 crc kubenswrapper[4691]: I0930 06:37:26.871679 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-67959b9db8-p8w6w" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.178:9311/healthcheck\": dial tcp 10.217.0.178:9311: i/o timeout (Client.Timeout exceeded while awaiting headers)" Sep 30 06:37:27 crc kubenswrapper[4691]: I0930 06:37:27.071491 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 06:37:27 crc kubenswrapper[4691]: I0930 06:37:27.236525 4691 scope.go:117] "RemoveContainer" containerID="0bda2d329b80bafe79c6a51e31c421bb8b3254080dc231fa2738d610c27ec014" Sep 30 06:37:27 crc kubenswrapper[4691]: I0930 06:37:27.350750 4691 generic.go:334] "Generic (PLEG): container finished" podID="0dd335f3-da02-4e62-a202-670de97399a9" containerID="6c26171ffd1df45ef63125ed39237d7064089677283a5d6999e84c70da922bd8" exitCode=0 Sep 30 06:37:27 crc kubenswrapper[4691]: I0930 06:37:27.350789 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"0dd335f3-da02-4e62-a202-670de97399a9","Type":"ContainerDied","Data":"6c26171ffd1df45ef63125ed39237d7064089677283a5d6999e84c70da922bd8"} Sep 30 06:37:27 crc kubenswrapper[4691]: I0930 06:37:27.387395 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-8687477df-8l865" Sep 30 06:37:27 crc kubenswrapper[4691]: I0930 06:37:27.434430 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f47894d84-xb69p"] Sep 30 06:37:27 crc kubenswrapper[4691]: I0930 06:37:27.434702 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f47894d84-xb69p" podUID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" containerName="neutron-api" containerID="cri-o://9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802" gracePeriod=30 Sep 30 06:37:27 crc kubenswrapper[4691]: I0930 06:37:27.435080 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-6f47894d84-xb69p" podUID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" containerName="neutron-httpd" containerID="cri-o://90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c" gracePeriod=30 Sep 30 06:37:28 crc kubenswrapper[4691]: I0930 06:37:28.237404 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-688b4ff469-2cgjc" Sep 30 06:37:28 crc kubenswrapper[4691]: I0930 06:37:28.242208 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-688b4ff469-2cgjc" Sep 30 06:37:28 crc kubenswrapper[4691]: I0930 06:37:28.384943 4691 generic.go:334] "Generic (PLEG): container finished" podID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" containerID="90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c" exitCode=0 Sep 30 06:37:28 crc kubenswrapper[4691]: I0930 06:37:28.385009 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47894d84-xb69p" event={"ID":"4d6bfe51-c6c1-4062-b1e1-22905c50a142","Type":"ContainerDied","Data":"90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c"} Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.191997 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.201774 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.250606 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data-custom\") pod \"0dd335f3-da02-4e62-a202-670de97399a9\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.250691 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-config\") pod \"ab991360-2557-48e2-b39e-91dece03bcbe\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.250752 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-svc\") pod \"ab991360-2557-48e2-b39e-91dece03bcbe\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.250798 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmq2r\" (UniqueName: \"kubernetes.io/projected/ab991360-2557-48e2-b39e-91dece03bcbe-kube-api-access-zmq2r\") pod \"ab991360-2557-48e2-b39e-91dece03bcbe\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.250828 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhp8g\" (UniqueName: \"kubernetes.io/projected/0dd335f3-da02-4e62-a202-670de97399a9-kube-api-access-hhp8g\") pod \"0dd335f3-da02-4e62-a202-670de97399a9\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.250862 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-swift-storage-0\") pod \"ab991360-2557-48e2-b39e-91dece03bcbe\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.250909 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-sb\") pod \"ab991360-2557-48e2-b39e-91dece03bcbe\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.250968 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0dd335f3-da02-4e62-a202-670de97399a9-etc-machine-id\") pod \"0dd335f3-da02-4e62-a202-670de97399a9\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.253719 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0dd335f3-da02-4e62-a202-670de97399a9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "0dd335f3-da02-4e62-a202-670de97399a9" (UID: "0dd335f3-da02-4e62-a202-670de97399a9"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.262139 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0dd335f3-da02-4e62-a202-670de97399a9-kube-api-access-hhp8g" (OuterVolumeSpecName: "kube-api-access-hhp8g") pod "0dd335f3-da02-4e62-a202-670de97399a9" (UID: "0dd335f3-da02-4e62-a202-670de97399a9"). InnerVolumeSpecName "kube-api-access-hhp8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.276963 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "0dd335f3-da02-4e62-a202-670de97399a9" (UID: "0dd335f3-da02-4e62-a202-670de97399a9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.277176 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab991360-2557-48e2-b39e-91dece03bcbe-kube-api-access-zmq2r" (OuterVolumeSpecName: "kube-api-access-zmq2r") pod "ab991360-2557-48e2-b39e-91dece03bcbe" (UID: "ab991360-2557-48e2-b39e-91dece03bcbe"). InnerVolumeSpecName "kube-api-access-zmq2r". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.326799 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ab991360-2557-48e2-b39e-91dece03bcbe" (UID: "ab991360-2557-48e2-b39e-91dece03bcbe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.328866 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ab991360-2557-48e2-b39e-91dece03bcbe" (UID: "ab991360-2557-48e2-b39e-91dece03bcbe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.350592 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ab991360-2557-48e2-b39e-91dece03bcbe" (UID: "ab991360-2557-48e2-b39e-91dece03bcbe"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.351818 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-combined-ca-bundle\") pod \"0dd335f3-da02-4e62-a202-670de97399a9\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.351856 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data\") pod \"0dd335f3-da02-4e62-a202-670de97399a9\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.351988 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-scripts\") pod \"0dd335f3-da02-4e62-a202-670de97399a9\" (UID: \"0dd335f3-da02-4e62-a202-670de97399a9\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.352012 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-nb\") pod \"ab991360-2557-48e2-b39e-91dece03bcbe\" (UID: \"ab991360-2557-48e2-b39e-91dece03bcbe\") " Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.352356 4691 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.352373 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.352383 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmq2r\" (UniqueName: \"kubernetes.io/projected/ab991360-2557-48e2-b39e-91dece03bcbe-kube-api-access-zmq2r\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.352392 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hhp8g\" (UniqueName: \"kubernetes.io/projected/0dd335f3-da02-4e62-a202-670de97399a9-kube-api-access-hhp8g\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.352400 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.352407 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.352418 4691 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0dd335f3-da02-4e62-a202-670de97399a9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.354043 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-config" (OuterVolumeSpecName: "config") pod "ab991360-2557-48e2-b39e-91dece03bcbe" (UID: "ab991360-2557-48e2-b39e-91dece03bcbe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.355820 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-scripts" (OuterVolumeSpecName: "scripts") pod "0dd335f3-da02-4e62-a202-670de97399a9" (UID: "0dd335f3-da02-4e62-a202-670de97399a9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.395429 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3be92833-059a-4083-9889-e552dc6eda8d","Type":"ContainerStarted","Data":"ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70"} Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.397772 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" event={"ID":"ab991360-2557-48e2-b39e-91dece03bcbe","Type":"ContainerDied","Data":"c99bca1cca04bba916cc8a5f28435bd0cdd1222c5d5f29163157bf3f7d99ee45"} Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.397805 4691 scope.go:117] "RemoveContainer" containerID="8c0cf071d81fdd3a88c2d92130296e4f87e7baf332fbdd0f9820c61cae827b36" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.397923 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5865f587f5-rvd99" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.401567 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/watcher-decision-engine-0" event={"ID":"af3f1644-3ab8-4a6a-9f80-f8ea42297e98","Type":"ContainerStarted","Data":"174b45df81242ad0f0bcb122c0960c0b000a912c0c02666a77eb10a32d0990ad"} Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.413565 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee","Type":"ContainerStarted","Data":"c3d8cdc8a84003482ecc9cd5bb17b5e3330a0da4e0465639b64aeec4c2bd1156"} Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.421284 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0dd335f3-da02-4e62-a202-670de97399a9" (UID: "0dd335f3-da02-4e62-a202-670de97399a9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.424971 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ab991360-2557-48e2-b39e-91dece03bcbe" (UID: "ab991360-2557-48e2-b39e-91dece03bcbe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.426531 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0dd335f3-da02-4e62-a202-670de97399a9","Type":"ContainerDied","Data":"393fd1f67fe2bf6db3d1de8d2a3ed849638384113c9fcf1f0911e5fa27f61d45"} Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.427288 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.429507 4691 scope.go:117] "RemoveContainer" containerID="ecbcabc3de94e7fe361d4043d01edb1a41343dd2b9c59e366f08316095a7c4b2" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.456035 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.456071 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.456487 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ab991360-2557-48e2-b39e-91dece03bcbe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.456505 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.458075 4691 scope.go:117] "RemoveContainer" containerID="d4acda655110d6831a69dda83e4b9168c76994949041e5c7a6a52d49c8340068" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.479803 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.548005491 podStartE2EDuration="21.479774498s" podCreationTimestamp="2025-09-30 06:37:08 +0000 UTC" firstStartedPulling="2025-09-30 06:37:09.891771679 +0000 UTC m=+1073.366792719" lastFinishedPulling="2025-09-30 06:37:28.823540676 +0000 UTC m=+1092.298561726" observedRunningTime="2025-09-30 06:37:29.438637417 +0000 UTC m=+1092.913658457" watchObservedRunningTime="2025-09-30 06:37:29.479774498 +0000 UTC m=+1092.954795558" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.480570 4691 scope.go:117] "RemoveContainer" containerID="6c26171ffd1df45ef63125ed39237d7064089677283a5d6999e84c70da922bd8" Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.483360 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data" (OuterVolumeSpecName: "config-data") pod "0dd335f3-da02-4e62-a202-670de97399a9" (UID: "0dd335f3-da02-4e62-a202-670de97399a9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.557886 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd335f3-da02-4e62-a202-670de97399a9-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.764203 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5865f587f5-rvd99"]
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.772750 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5865f587f5-rvd99"]
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.779808 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.796996 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.804341 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 06:37:29 crc kubenswrapper[4691]: E0930 06:37:29.804713 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd335f3-da02-4e62-a202-670de97399a9" containerName="cinder-scheduler"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.804729 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd335f3-da02-4e62-a202-670de97399a9" containerName="cinder-scheduler"
Sep 30 06:37:29 crc kubenswrapper[4691]: E0930 06:37:29.804743 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.804750 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api"
Sep 30 06:37:29 crc kubenswrapper[4691]: E0930 06:37:29.804762 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd335f3-da02-4e62-a202-670de97399a9" containerName="probe"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.804768 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd335f3-da02-4e62-a202-670de97399a9" containerName="probe"
Sep 30 06:37:29 crc kubenswrapper[4691]: E0930 06:37:29.804779 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab991360-2557-48e2-b39e-91dece03bcbe" containerName="init"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.804785 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab991360-2557-48e2-b39e-91dece03bcbe" containerName="init"
Sep 30 06:37:29 crc kubenswrapper[4691]: E0930 06:37:29.804799 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab991360-2557-48e2-b39e-91dece03bcbe" containerName="dnsmasq-dns"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.804804 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab991360-2557-48e2-b39e-91dece03bcbe" containerName="dnsmasq-dns"
Sep 30 06:37:29 crc kubenswrapper[4691]: E0930 06:37:29.804820 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api-log"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.804826 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api-log"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.805004 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.805015 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbfa4b8e-37ac-4713-a8af-b610f086f7e2" containerName="barbican-api-log"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.805027 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd335f3-da02-4e62-a202-670de97399a9" containerName="cinder-scheduler"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.805036 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab991360-2557-48e2-b39e-91dece03bcbe" containerName="dnsmasq-dns"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.805045 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd335f3-da02-4e62-a202-670de97399a9" containerName="probe"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.806041 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.807952 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.819847 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.964506 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0937a7d3-6bf0-4114-b73b-0d10f2f19945-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.964560 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-scripts\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.964577 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-config-data\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.964895 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.965002 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ql8kc\" (UniqueName: \"kubernetes.io/projected/0937a7d3-6bf0-4114-b73b-0d10f2f19945-kube-api-access-ql8kc\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:29 crc kubenswrapper[4691]: I0930 06:37:29.965155 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.066475 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-scripts\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.066514 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-config-data\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.066608 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.066645 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ql8kc\" (UniqueName: \"kubernetes.io/projected/0937a7d3-6bf0-4114-b73b-0d10f2f19945-kube-api-access-ql8kc\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.066702 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.066742 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0937a7d3-6bf0-4114-b73b-0d10f2f19945-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.066815 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0937a7d3-6bf0-4114-b73b-0d10f2f19945-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.071107 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.071401 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-scripts\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.071594 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-config-data\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.071691 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0937a7d3-6bf0-4114-b73b-0d10f2f19945-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.089535 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ql8kc\" (UniqueName: \"kubernetes.io/projected/0937a7d3-6bf0-4114-b73b-0d10f2f19945-kube-api-access-ql8kc\") pod \"cinder-scheduler-0\" (UID: \"0937a7d3-6bf0-4114-b73b-0d10f2f19945\") " pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.136973 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Sep 30 06:37:30 crc kubenswrapper[4691]: I0930 06:37:30.800865 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.237175 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd335f3-da02-4e62-a202-670de97399a9" path="/var/lib/kubelet/pods/0dd335f3-da02-4e62-a202-670de97399a9/volumes"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.238098 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab991360-2557-48e2-b39e-91dece03bcbe" path="/var/lib/kubelet/pods/ab991360-2557-48e2-b39e-91dece03bcbe/volumes"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.335598 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f47894d84-xb69p"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.453341 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0937a7d3-6bf0-4114-b73b-0d10f2f19945","Type":"ContainerStarted","Data":"98dcb648f161e83fbf7589c089c646bc1106aafdbada880c9e1ea0ea099d8d44"}
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.455700 4691 generic.go:334] "Generic (PLEG): container finished" podID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" containerID="9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802" exitCode=0
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.455775 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47894d84-xb69p" event={"ID":"4d6bfe51-c6c1-4062-b1e1-22905c50a142","Type":"ContainerDied","Data":"9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802"}
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.455791 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-6f47894d84-xb69p"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.455813 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-6f47894d84-xb69p" event={"ID":"4d6bfe51-c6c1-4062-b1e1-22905c50a142","Type":"ContainerDied","Data":"578d6ca02bf5638cef2fd8a37adebbfeebdfd3896a7769eaf6f7f33ec501117b"}
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.455859 4691 scope.go:117] "RemoveContainer" containerID="90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.459141 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3be92833-059a-4083-9889-e552dc6eda8d","Type":"ContainerStarted","Data":"d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e"}
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.459261 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="ceilometer-central-agent" containerID="cri-o://d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2" gracePeriod=30
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.459327 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.459367 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="proxy-httpd" containerID="cri-o://d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e" gracePeriod=30
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.459411 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="ceilometer-notification-agent" containerID="cri-o://f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f" gracePeriod=30
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.459505 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="sg-core" containerID="cri-o://ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70" gracePeriod=30
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.486347 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.990372133 podStartE2EDuration="11.486328048s" podCreationTimestamp="2025-09-30 06:37:20 +0000 UTC" firstStartedPulling="2025-09-30 06:37:21.657008319 +0000 UTC m=+1085.132029359" lastFinishedPulling="2025-09-30 06:37:30.152964234 +0000 UTC m=+1093.627985274" observedRunningTime="2025-09-30 06:37:31.475238652 +0000 UTC m=+1094.950259722" watchObservedRunningTime="2025-09-30 06:37:31.486328048 +0000 UTC m=+1094.961349088"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.493499 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn728\" (UniqueName: \"kubernetes.io/projected/4d6bfe51-c6c1-4062-b1e1-22905c50a142-kube-api-access-tn728\") pod \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") "
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.493569 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-httpd-config\") pod \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") "
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.493647 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-combined-ca-bundle\") pod \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") "
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.493698 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-config\") pod \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") "
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.493746 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-ovndb-tls-certs\") pod \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\" (UID: \"4d6bfe51-c6c1-4062-b1e1-22905c50a142\") "
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.500249 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "4d6bfe51-c6c1-4062-b1e1-22905c50a142" (UID: "4d6bfe51-c6c1-4062-b1e1-22905c50a142"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.500723 4691 scope.go:117] "RemoveContainer" containerID="9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.504611 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d6bfe51-c6c1-4062-b1e1-22905c50a142-kube-api-access-tn728" (OuterVolumeSpecName: "kube-api-access-tn728") pod "4d6bfe51-c6c1-4062-b1e1-22905c50a142" (UID: "4d6bfe51-c6c1-4062-b1e1-22905c50a142"). InnerVolumeSpecName "kube-api-access-tn728". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.534286 4691 scope.go:117] "RemoveContainer" containerID="90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c"
Sep 30 06:37:31 crc kubenswrapper[4691]: E0930 06:37:31.535033 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c\": container with ID starting with 90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c not found: ID does not exist" containerID="90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.535125 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c"} err="failed to get container status \"90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c\": rpc error: code = NotFound desc = could not find container \"90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c\": container with ID starting with 90829e5557768999da198dbb96694764a139fd80b48837ac756544291f70a36c not found: ID does not exist"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.535196 4691 scope.go:117] "RemoveContainer" containerID="9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802"
Sep 30 06:37:31 crc kubenswrapper[4691]: E0930 06:37:31.535592 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802\": container with ID starting with 9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802 not found: ID does not exist" containerID="9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.535634 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802"} err="failed to get container status \"9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802\": rpc error: code = NotFound desc = could not find container \"9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802\": container with ID starting with 9e0a6bbbcf9c4b88082c2de5488f2b0e557254e439866c0f97aad7411933c802 not found: ID does not exist"
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.571966 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4d6bfe51-c6c1-4062-b1e1-22905c50a142" (UID: "4d6bfe51-c6c1-4062-b1e1-22905c50a142"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.575698 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-config" (OuterVolumeSpecName: "config") pod "4d6bfe51-c6c1-4062-b1e1-22905c50a142" (UID: "4d6bfe51-c6c1-4062-b1e1-22905c50a142"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.599605 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn728\" (UniqueName: \"kubernetes.io/projected/4d6bfe51-c6c1-4062-b1e1-22905c50a142-kube-api-access-tn728\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.599629 4691 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-httpd-config\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.599844 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.599854 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-config\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.613512 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "4d6bfe51-c6c1-4062-b1e1-22905c50a142" (UID: "4d6bfe51-c6c1-4062-b1e1-22905c50a142"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.701512 4691 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4d6bfe51-c6c1-4062-b1e1-22905c50a142-ovndb-tls-certs\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.791716 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-6f47894d84-xb69p"]
Sep 30 06:37:31 crc kubenswrapper[4691]: I0930 06:37:31.800462 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-6f47894d84-xb69p"]
Sep 30 06:37:32 crc kubenswrapper[4691]: I0930 06:37:32.484817 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0937a7d3-6bf0-4114-b73b-0d10f2f19945","Type":"ContainerStarted","Data":"922f4576d921dc05660db2a80cf591e40c226567b2d6bef8c10f87c9279972b0"}
Sep 30 06:37:32 crc kubenswrapper[4691]: I0930 06:37:32.485102 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"0937a7d3-6bf0-4114-b73b-0d10f2f19945","Type":"ContainerStarted","Data":"5fc688847b7101dca41dc91f614c813b4753e6b688a4ae0be8aea93143c88393"}
Sep 30 06:37:32 crc kubenswrapper[4691]: I0930 06:37:32.491377 4691 generic.go:334] "Generic (PLEG): container finished" podID="3be92833-059a-4083-9889-e552dc6eda8d" containerID="d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e" exitCode=0
Sep 30 06:37:32 crc kubenswrapper[4691]: I0930 06:37:32.491410 4691 generic.go:334] "Generic (PLEG): container finished" podID="3be92833-059a-4083-9889-e552dc6eda8d" containerID="ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70" exitCode=2
Sep 30 06:37:32 crc kubenswrapper[4691]: I0930 06:37:32.491420 4691 generic.go:334] "Generic (PLEG): container finished" podID="3be92833-059a-4083-9889-e552dc6eda8d" containerID="d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2" exitCode=0
Sep 30 06:37:32 crc kubenswrapper[4691]: I0930 06:37:32.491441 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3be92833-059a-4083-9889-e552dc6eda8d","Type":"ContainerDied","Data":"d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e"}
Sep 30 06:37:32 crc kubenswrapper[4691]: I0930 06:37:32.491477 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3be92833-059a-4083-9889-e552dc6eda8d","Type":"ContainerDied","Data":"ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70"}
Sep 30 06:37:32 crc kubenswrapper[4691]: I0930 06:37:32.491487 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3be92833-059a-4083-9889-e552dc6eda8d","Type":"ContainerDied","Data":"d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2"}
Sep 30 06:37:32 crc kubenswrapper[4691]: I0930 06:37:32.507813 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.507798701 podStartE2EDuration="3.507798701s" podCreationTimestamp="2025-09-30 06:37:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:32.505154046 +0000 UTC m=+1095.980175086" watchObservedRunningTime="2025-09-30 06:37:32.507798701 +0000 UTC m=+1095.982819741"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.234991 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" path="/var/lib/kubelet/pods/4d6bfe51-c6c1-4062-b1e1-22905c50a142/volumes"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.238074 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.333255 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-combined-ca-bundle\") pod \"3be92833-059a-4083-9889-e552dc6eda8d\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") "
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.333512 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-config-data\") pod \"3be92833-059a-4083-9889-e552dc6eda8d\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") "
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.333571 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jswsb\" (UniqueName: \"kubernetes.io/projected/3be92833-059a-4083-9889-e552dc6eda8d-kube-api-access-jswsb\") pod \"3be92833-059a-4083-9889-e552dc6eda8d\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") "
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.333606 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-run-httpd\") pod \"3be92833-059a-4083-9889-e552dc6eda8d\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") "
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.333667 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-log-httpd\") pod \"3be92833-059a-4083-9889-e552dc6eda8d\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") "
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.333758 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-sg-core-conf-yaml\") pod \"3be92833-059a-4083-9889-e552dc6eda8d\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") "
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.333797 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-scripts\") pod \"3be92833-059a-4083-9889-e552dc6eda8d\" (UID: \"3be92833-059a-4083-9889-e552dc6eda8d\") "
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.334203 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3be92833-059a-4083-9889-e552dc6eda8d" (UID: "3be92833-059a-4083-9889-e552dc6eda8d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.334808 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3be92833-059a-4083-9889-e552dc6eda8d" (UID: "3be92833-059a-4083-9889-e552dc6eda8d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.339548 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-scripts" (OuterVolumeSpecName: "scripts") pod "3be92833-059a-4083-9889-e552dc6eda8d" (UID: "3be92833-059a-4083-9889-e552dc6eda8d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.341480 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be92833-059a-4083-9889-e552dc6eda8d-kube-api-access-jswsb" (OuterVolumeSpecName: "kube-api-access-jswsb") pod "3be92833-059a-4083-9889-e552dc6eda8d" (UID: "3be92833-059a-4083-9889-e552dc6eda8d"). InnerVolumeSpecName "kube-api-access-jswsb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.361473 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3be92833-059a-4083-9889-e552dc6eda8d" (UID: "3be92833-059a-4083-9889-e552dc6eda8d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.427232 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-config-data" (OuterVolumeSpecName: "config-data") pod "3be92833-059a-4083-9889-e552dc6eda8d" (UID: "3be92833-059a-4083-9889-e552dc6eda8d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.427547 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3be92833-059a-4083-9889-e552dc6eda8d" (UID: "3be92833-059a-4083-9889-e552dc6eda8d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.435814 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.435837 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-scripts\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.435846 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.435854 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3be92833-059a-4083-9889-e552dc6eda8d-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.435862 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jswsb\" (UniqueName: \"kubernetes.io/projected/3be92833-059a-4083-9889-e552dc6eda8d-kube-api-access-jswsb\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.435872 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-run-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.435879 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3be92833-059a-4083-9889-e552dc6eda8d-log-httpd\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.501426 4691 generic.go:334] "Generic (PLEG): container finished" podID="3be92833-059a-4083-9889-e552dc6eda8d" containerID="f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f" exitCode=0
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.501691 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.501711 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3be92833-059a-4083-9889-e552dc6eda8d","Type":"ContainerDied","Data":"f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f"}
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.501758 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3be92833-059a-4083-9889-e552dc6eda8d","Type":"ContainerDied","Data":"eccea64742222760309b7d7d6b9b77fef5f4330f16b3aa773753e093b9691580"}
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.501775 4691 scope.go:117] "RemoveContainer" containerID="d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.527557 4691 scope.go:117] "RemoveContainer" containerID="ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.543573 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.560759 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.564754 4691 scope.go:117] "RemoveContainer" containerID="f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573055 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:37:33 crc kubenswrapper[4691]: E0930 06:37:33.573448 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="sg-core"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573464 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="sg-core"
Sep 30 06:37:33 crc kubenswrapper[4691]: E0930 06:37:33.573490 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="ceilometer-central-agent"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573498 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="ceilometer-central-agent"
Sep 30 06:37:33 crc kubenswrapper[4691]: E0930 06:37:33.573519 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="ceilometer-notification-agent"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573526 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="ceilometer-notification-agent"
Sep 30 06:37:33 crc kubenswrapper[4691]: E0930 06:37:33.573545 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="proxy-httpd"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573552 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="proxy-httpd"
Sep 30 06:37:33 crc kubenswrapper[4691]: E0930 06:37:33.573564 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" containerName="neutron-httpd"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573570 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" containerName="neutron-httpd"
Sep 30 06:37:33 crc kubenswrapper[4691]: E0930 06:37:33.573579 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" containerName="neutron-api"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573585 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" containerName="neutron-api"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573771 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="proxy-httpd"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573796 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" containerName="neutron-httpd"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573808 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="ceilometer-central-agent"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573818 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d6bfe51-c6c1-4062-b1e1-22905c50a142" containerName="neutron-api"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573829 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="ceilometer-notification-agent"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.573841 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be92833-059a-4083-9889-e552dc6eda8d" containerName="sg-core"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.575760 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.579322 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.579543 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.581240 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.599222 4691 scope.go:117] "RemoveContainer" containerID="d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.619579 4691 scope.go:117] "RemoveContainer" containerID="d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e"
Sep 30 06:37:33 crc kubenswrapper[4691]: E0930 06:37:33.620048 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e\": container with ID starting with d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e not found: ID does not exist" containerID="d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.620102 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e"} err="failed to get container status \"d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e\": rpc error: code = NotFound desc = could not find container \"d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e\": container with ID starting with d24c0dcdcce162b58e02d74bbd3d4e3151bc794d2e261ced9b6ccbd96084e66e not found: ID does not exist"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.620137 4691 scope.go:117] "RemoveContainer" containerID="ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70"
Sep 30 06:37:33 crc kubenswrapper[4691]: E0930 06:37:33.620491 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70\": container with ID starting with ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70 not found: ID does not exist" containerID="ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.620565 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70"} err="failed to get container status \"ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70\": rpc error: code = NotFound desc = could not find container \"ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70\": container with ID starting with ebc1ed6051f60cba9607917440d752aac33b9de2590a55ad74c3a51788f1fe70 not found: ID does not exist"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.620606 4691 scope.go:117] "RemoveContainer" containerID="f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f"
Sep 30 06:37:33 crc kubenswrapper[4691]: E0930 06:37:33.620991 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f\": container with ID starting with f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f not found: ID does not exist" containerID="f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.621012 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f"} err="failed to get container status \"f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f\": rpc error: code = NotFound desc = could not find container \"f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f\": container with ID starting with f25b71b6f1052d8b59a7ea4de7a16b20c40da500c24d735a792a518d0c20d88f not found: ID does not exist"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.621027 4691 scope.go:117] "RemoveContainer" containerID="d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2"
Sep 30 06:37:33 crc kubenswrapper[4691]: E0930 06:37:33.621331 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2\": container with ID starting with d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2 not found: ID does not exist" containerID="d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.621349 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2"} err="failed to get container status \"d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2\": rpc error: code = NotFound desc = could not find container \"d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2\": container with ID starting with d86e2c15860aedf5d0bbd88095c704b3bcb5fdb432d31ef9ace9b46eb5a705a2 not found: ID does not exist"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.741847 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-log-httpd\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.741896 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-config-data\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.741933 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.741953 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-run-httpd\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.742051 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ttcq\" (UniqueName: \"kubernetes.io/projected/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-kube-api-access-7ttcq\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.742281 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-scripts\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.742332 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.844934 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-log-httpd\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.845093 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-config-data\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.845144 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.845237 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-run-httpd\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.845316 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ttcq\" (UniqueName: \"kubernetes.io/projected/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-kube-api-access-7ttcq\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.845268 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-log-httpd\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.845510 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-scripts\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.845588 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.845812 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-run-httpd\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.848573 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.849575 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.850670 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-scripts\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.851546 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-config-data\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.861212 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ttcq\" (UniqueName: \"kubernetes.io/projected/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-kube-api-access-7ttcq\") pod \"ceilometer-0\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " pod="openstack/ceilometer-0"
Sep 30 06:37:33 crc kubenswrapper[4691]: I0930 06:37:33.903331 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Sep 30 06:37:34 crc kubenswrapper[4691]: I0930 06:37:34.438052 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:37:34 crc kubenswrapper[4691]: I0930 06:37:34.515824 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a","Type":"ContainerStarted","Data":"d58be6dd2cf9a2bd829283c9ac2aa04fb8925c7e65cc7024f06810081223fe55"}
Sep 30 06:37:34 crc kubenswrapper[4691]: I0930 06:37:34.973358 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-zs2kc"]
Sep 30 06:37:34 crc kubenswrapper[4691]: I0930 06:37:34.975449 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zs2kc"
Sep 30 06:37:34 crc kubenswrapper[4691]: I0930 06:37:34.987522 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zs2kc"]
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.059281 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-9j44w"]
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.060667 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9j44w"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.080873 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bqt8\" (UniqueName: \"kubernetes.io/projected/63e509fa-065b-49fa-8e8b-292350d86b8f-kube-api-access-5bqt8\") pod \"nova-api-db-create-zs2kc\" (UID: \"63e509fa-065b-49fa-8e8b-292350d86b8f\") " pod="openstack/nova-api-db-create-zs2kc"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.086246 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9j44w"]
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.140995 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.183365 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bqt8\" (UniqueName: \"kubernetes.io/projected/63e509fa-065b-49fa-8e8b-292350d86b8f-kube-api-access-5bqt8\") pod \"nova-api-db-create-zs2kc\" (UID: \"63e509fa-065b-49fa-8e8b-292350d86b8f\") " pod="openstack/nova-api-db-create-zs2kc"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.183458 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgcl9\" (UniqueName: \"kubernetes.io/projected/9a199530-065b-4a27-bd83-b0b8f0ae2c13-kube-api-access-sgcl9\") pod \"nova-cell0-db-create-9j44w\" (UID: \"9a199530-065b-4a27-bd83-b0b8f0ae2c13\") " pod="openstack/nova-cell0-db-create-9j44w"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.200854 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bqt8\" (UniqueName: \"kubernetes.io/projected/63e509fa-065b-49fa-8e8b-292350d86b8f-kube-api-access-5bqt8\") pod \"nova-api-db-create-zs2kc\" (UID: \"63e509fa-065b-49fa-8e8b-292350d86b8f\") " pod="openstack/nova-api-db-create-zs2kc"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.243392 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be92833-059a-4083-9889-e552dc6eda8d" path="/var/lib/kubelet/pods/3be92833-059a-4083-9889-e552dc6eda8d/volumes"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.265276 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-f7vsc"]
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.267753 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f7vsc"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.276333 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f7vsc"]
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.285207 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgcl9\" (UniqueName: \"kubernetes.io/projected/9a199530-065b-4a27-bd83-b0b8f0ae2c13-kube-api-access-sgcl9\") pod \"nova-cell0-db-create-9j44w\" (UID: \"9a199530-065b-4a27-bd83-b0b8f0ae2c13\") " pod="openstack/nova-cell0-db-create-9j44w"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.305933 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zs2kc"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.309512 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgcl9\" (UniqueName: \"kubernetes.io/projected/9a199530-065b-4a27-bd83-b0b8f0ae2c13-kube-api-access-sgcl9\") pod \"nova-cell0-db-create-9j44w\" (UID: \"9a199530-065b-4a27-bd83-b0b8f0ae2c13\") " pod="openstack/nova-cell0-db-create-9j44w"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.335972 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/watcher-decision-engine-0"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.336020 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/watcher-decision-engine-0"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.372088 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/watcher-decision-engine-0"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.387137 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnlt9\" (UniqueName: \"kubernetes.io/projected/149dd243-4659-4434-a6d1-63fb57617546-kube-api-access-cnlt9\") pod \"nova-cell1-db-create-f7vsc\" (UID: \"149dd243-4659-4434-a6d1-63fb57617546\") " pod="openstack/nova-cell1-db-create-f7vsc"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.489113 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cnlt9\" (UniqueName: \"kubernetes.io/projected/149dd243-4659-4434-a6d1-63fb57617546-kube-api-access-cnlt9\") pod \"nova-cell1-db-create-f7vsc\" (UID: \"149dd243-4659-4434-a6d1-63fb57617546\") " pod="openstack/nova-cell1-db-create-f7vsc"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.507339 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnlt9\" (UniqueName: \"kubernetes.io/projected/149dd243-4659-4434-a6d1-63fb57617546-kube-api-access-cnlt9\") pod \"nova-cell1-db-create-f7vsc\" (UID: \"149dd243-4659-4434-a6d1-63fb57617546\") " pod="openstack/nova-cell1-db-create-f7vsc"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.527937 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a","Type":"ContainerStarted","Data":"0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382"}
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.528004 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a","Type":"ContainerStarted","Data":"3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42"}
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.557727 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9j44w"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.575164 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/watcher-decision-engine-0"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.620858 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f7vsc"
Sep 30 06:37:35 crc kubenswrapper[4691]: I0930 06:37:35.771004 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-zs2kc"]
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.036950 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-9j44w"]
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.170586 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f7vsc"]
Sep 30 06:37:36 crc kubenswrapper[4691]: W0930 06:37:36.194161 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod149dd243_4659_4434_a6d1_63fb57617546.slice/crio-774a33533df681799897a53a758685eefb741d4aeac4be0d29946a8889886839 WatchSource:0}: Error finding container 774a33533df681799897a53a758685eefb741d4aeac4be0d29946a8889886839: Status 404 returned error can't find the container with id 774a33533df681799897a53a758685eefb741d4aeac4be0d29946a8889886839
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.540764 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a","Type":"ContainerStarted","Data":"eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d"}
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.542216 4691 generic.go:334] "Generic (PLEG): container finished" podID="9a199530-065b-4a27-bd83-b0b8f0ae2c13" containerID="10cdfd114fab731994de2eee98c74403b9f52834b6af5b4fe1d54c3a849b5851" exitCode=0
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.542251 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9j44w" event={"ID":"9a199530-065b-4a27-bd83-b0b8f0ae2c13","Type":"ContainerDied","Data":"10cdfd114fab731994de2eee98c74403b9f52834b6af5b4fe1d54c3a849b5851"}
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.542423 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9j44w" event={"ID":"9a199530-065b-4a27-bd83-b0b8f0ae2c13","Type":"ContainerStarted","Data":"310494015bb9a71e652b490b371eca34873082aa9ae4fff20fe427a37e4bb3d1"}
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.545265 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f7vsc" event={"ID":"149dd243-4659-4434-a6d1-63fb57617546","Type":"ContainerStarted","Data":"44dc43004671078ef18a6070b1f939de4f8bbefd20a88c929fb127889b048565"}
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.545304 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f7vsc" event={"ID":"149dd243-4659-4434-a6d1-63fb57617546","Type":"ContainerStarted","Data":"774a33533df681799897a53a758685eefb741d4aeac4be0d29946a8889886839"}
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.546783 4691 generic.go:334] "Generic (PLEG): container finished" podID="63e509fa-065b-49fa-8e8b-292350d86b8f" containerID="43088109a9909db3c1bd729fe02db38943455e3bb1680ececbbb8ca2edc63c80" exitCode=0
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.546847 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zs2kc" event={"ID":"63e509fa-065b-49fa-8e8b-292350d86b8f","Type":"ContainerDied","Data":"43088109a9909db3c1bd729fe02db38943455e3bb1680ececbbb8ca2edc63c80"}
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.547314 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zs2kc" event={"ID":"63e509fa-065b-49fa-8e8b-292350d86b8f","Type":"ContainerStarted","Data":"9eb1d649a7083f322dc84dc73b21fda1de962691db3305613bc4d0445924e360"}
Sep 30 06:37:36 crc kubenswrapper[4691]: I0930 06:37:36.582702 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-f7vsc" podStartSLOduration=1.582686264 podStartE2EDuration="1.582686264s" podCreationTimestamp="2025-09-30 06:37:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:36.580799842 +0000 UTC m=+1100.055820882" watchObservedRunningTime="2025-09-30 06:37:36.582686264 +0000 UTC m=+1100.057707304"
Sep 30 06:37:37 crc kubenswrapper[4691]: I0930 06:37:37.574260 4691 generic.go:334] "Generic (PLEG): container finished" podID="149dd243-4659-4434-a6d1-63fb57617546" containerID="44dc43004671078ef18a6070b1f939de4f8bbefd20a88c929fb127889b048565" exitCode=0
Sep 30 06:37:37 crc kubenswrapper[4691]: I0930 06:37:37.574681 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f7vsc" event={"ID":"149dd243-4659-4434-a6d1-63fb57617546","Type":"ContainerDied","Data":"44dc43004671078ef18a6070b1f939de4f8bbefd20a88c929fb127889b048565"}
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.149561 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-zs2kc"
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.158632 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9j44w"
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.167075 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgcl9\" (UniqueName: \"kubernetes.io/projected/9a199530-065b-4a27-bd83-b0b8f0ae2c13-kube-api-access-sgcl9\") pod \"9a199530-065b-4a27-bd83-b0b8f0ae2c13\" (UID: \"9a199530-065b-4a27-bd83-b0b8f0ae2c13\") "
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.167187 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bqt8\" (UniqueName: \"kubernetes.io/projected/63e509fa-065b-49fa-8e8b-292350d86b8f-kube-api-access-5bqt8\") pod \"63e509fa-065b-49fa-8e8b-292350d86b8f\" (UID: \"63e509fa-065b-49fa-8e8b-292350d86b8f\") "
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.173409 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a199530-065b-4a27-bd83-b0b8f0ae2c13-kube-api-access-sgcl9" (OuterVolumeSpecName: "kube-api-access-sgcl9") pod "9a199530-065b-4a27-bd83-b0b8f0ae2c13" (UID: "9a199530-065b-4a27-bd83-b0b8f0ae2c13"). InnerVolumeSpecName "kube-api-access-sgcl9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.173561 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63e509fa-065b-49fa-8e8b-292350d86b8f-kube-api-access-5bqt8" (OuterVolumeSpecName: "kube-api-access-5bqt8") pod "63e509fa-065b-49fa-8e8b-292350d86b8f" (UID: "63e509fa-065b-49fa-8e8b-292350d86b8f"). InnerVolumeSpecName "kube-api-access-5bqt8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.269480 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bqt8\" (UniqueName: \"kubernetes.io/projected/63e509fa-065b-49fa-8e8b-292350d86b8f-kube-api-access-5bqt8\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.269506 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sgcl9\" (UniqueName: \"kubernetes.io/projected/9a199530-065b-4a27-bd83-b0b8f0ae2c13-kube-api-access-sgcl9\") on node \"crc\" DevicePath \"\""
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.431858 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.433866 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="df950ad9-45e7-4c79-ba30-ef4b423809b0" containerName="glance-log" containerID="cri-o://6e10c72881f1db003dcd37cc7e2bc8d9653b2faad08dd92dbbc418620d9785a7" gracePeriod=30
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.434345 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="df950ad9-45e7-4c79-ba30-ef4b423809b0" containerName="glance-httpd" containerID="cri-o://80b311ef92245ba29ced3c5c44eb4f6e0a4d8ae29cd79b446f38c4dda55ef30f" gracePeriod=30
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.584918 4691 generic.go:334] "Generic (PLEG): container finished" podID="df950ad9-45e7-4c79-ba30-ef4b423809b0" containerID="6e10c72881f1db003dcd37cc7e2bc8d9653b2faad08dd92dbbc418620d9785a7" exitCode=143
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.585011 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df950ad9-45e7-4c79-ba30-ef4b423809b0","Type":"ContainerDied","Data":"6e10c72881f1db003dcd37cc7e2bc8d9653b2faad08dd92dbbc418620d9785a7"}
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.586799 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-zs2kc" event={"ID":"63e509fa-065b-49fa-8e8b-292350d86b8f","Type":"ContainerDied","Data":"9eb1d649a7083f322dc84dc73b21fda1de962691db3305613bc4d0445924e360"}
Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.586837 4691 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-api-db-create-zs2kc" Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.586859 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9eb1d649a7083f322dc84dc73b21fda1de962691db3305613bc4d0445924e360" Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.590288 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a","Type":"ContainerStarted","Data":"0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee"} Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.590459 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.598439 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-9j44w" Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.599979 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-9j44w" event={"ID":"9a199530-065b-4a27-bd83-b0b8f0ae2c13","Type":"ContainerDied","Data":"310494015bb9a71e652b490b371eca34873082aa9ae4fff20fe427a37e4bb3d1"} Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.600032 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="310494015bb9a71e652b490b371eca34873082aa9ae4fff20fe427a37e4bb3d1" Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.953508 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f7vsc" Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.973064 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.98841799 podStartE2EDuration="5.973049421s" podCreationTimestamp="2025-09-30 06:37:33 +0000 UTC" firstStartedPulling="2025-09-30 06:37:34.440301284 +0000 UTC m=+1097.915322334" lastFinishedPulling="2025-09-30 06:37:37.424932735 +0000 UTC m=+1100.899953765" observedRunningTime="2025-09-30 06:37:38.622301684 +0000 UTC m=+1102.097322734" watchObservedRunningTime="2025-09-30 06:37:38.973049421 +0000 UTC m=+1102.448070461" Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.982184 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnlt9\" (UniqueName: \"kubernetes.io/projected/149dd243-4659-4434-a6d1-63fb57617546-kube-api-access-cnlt9\") pod \"149dd243-4659-4434-a6d1-63fb57617546\" (UID: \"149dd243-4659-4434-a6d1-63fb57617546\") " Sep 30 06:37:38 crc kubenswrapper[4691]: I0930 06:37:38.990245 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/149dd243-4659-4434-a6d1-63fb57617546-kube-api-access-cnlt9" (OuterVolumeSpecName: "kube-api-access-cnlt9") pod "149dd243-4659-4434-a6d1-63fb57617546" (UID: "149dd243-4659-4434-a6d1-63fb57617546"). InnerVolumeSpecName "kube-api-access-cnlt9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:39 crc kubenswrapper[4691]: I0930 06:37:39.084993 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cnlt9\" (UniqueName: \"kubernetes.io/projected/149dd243-4659-4434-a6d1-63fb57617546-kube-api-access-cnlt9\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:39 crc kubenswrapper[4691]: I0930 06:37:39.608978 4691 generic.go:334] "Generic (PLEG): container finished" podID="df950ad9-45e7-4c79-ba30-ef4b423809b0" containerID="80b311ef92245ba29ced3c5c44eb4f6e0a4d8ae29cd79b446f38c4dda55ef30f" exitCode=0 Sep 30 06:37:39 crc kubenswrapper[4691]: I0930 06:37:39.609270 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df950ad9-45e7-4c79-ba30-ef4b423809b0","Type":"ContainerDied","Data":"80b311ef92245ba29ced3c5c44eb4f6e0a4d8ae29cd79b446f38c4dda55ef30f"} Sep 30 06:37:39 crc kubenswrapper[4691]: I0930 06:37:39.611002 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f7vsc" event={"ID":"149dd243-4659-4434-a6d1-63fb57617546","Type":"ContainerDied","Data":"774a33533df681799897a53a758685eefb741d4aeac4be0d29946a8889886839"} Sep 30 06:37:39 crc kubenswrapper[4691]: I0930 06:37:39.611032 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="774a33533df681799897a53a758685eefb741d4aeac4be0d29946a8889886839" Sep 30 06:37:39 crc kubenswrapper[4691]: I0930 06:37:39.611048 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f7vsc" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.015249 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.101548 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-httpd-run\") pod \"df950ad9-45e7-4c79-ba30-ef4b423809b0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.101786 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"df950ad9-45e7-4c79-ba30-ef4b423809b0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.102056 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-logs\") pod \"df950ad9-45e7-4c79-ba30-ef4b423809b0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.102115 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-config-data\") pod \"df950ad9-45e7-4c79-ba30-ef4b423809b0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.102144 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "df950ad9-45e7-4c79-ba30-ef4b423809b0" (UID: "df950ad9-45e7-4c79-ba30-ef4b423809b0"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.102180 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-scripts\") pod \"df950ad9-45e7-4c79-ba30-ef4b423809b0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.102211 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-combined-ca-bundle\") pod \"df950ad9-45e7-4c79-ba30-ef4b423809b0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.102288 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9h4x\" (UniqueName: \"kubernetes.io/projected/df950ad9-45e7-4c79-ba30-ef4b423809b0-kube-api-access-g9h4x\") pod \"df950ad9-45e7-4c79-ba30-ef4b423809b0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.102621 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-internal-tls-certs\") pod \"df950ad9-45e7-4c79-ba30-ef4b423809b0\" (UID: \"df950ad9-45e7-4c79-ba30-ef4b423809b0\") " Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.102768 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-logs" (OuterVolumeSpecName: "logs") pod "df950ad9-45e7-4c79-ba30-ef4b423809b0" (UID: "df950ad9-45e7-4c79-ba30-ef4b423809b0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.103023 4691 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.103034 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/df950ad9-45e7-4c79-ba30-ef4b423809b0-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.111273 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df950ad9-45e7-4c79-ba30-ef4b423809b0-kube-api-access-g9h4x" (OuterVolumeSpecName: "kube-api-access-g9h4x") pod "df950ad9-45e7-4c79-ba30-ef4b423809b0" (UID: "df950ad9-45e7-4c79-ba30-ef4b423809b0"). InnerVolumeSpecName "kube-api-access-g9h4x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.111280 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-scripts" (OuterVolumeSpecName: "scripts") pod "df950ad9-45e7-4c79-ba30-ef4b423809b0" (UID: "df950ad9-45e7-4c79-ba30-ef4b423809b0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.135042 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "df950ad9-45e7-4c79-ba30-ef4b423809b0" (UID: "df950ad9-45e7-4c79-ba30-ef4b423809b0"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.164214 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-config-data" (OuterVolumeSpecName: "config-data") pod "df950ad9-45e7-4c79-ba30-ef4b423809b0" (UID: "df950ad9-45e7-4c79-ba30-ef4b423809b0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.164583 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "df950ad9-45e7-4c79-ba30-ef4b423809b0" (UID: "df950ad9-45e7-4c79-ba30-ef4b423809b0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.172588 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "df950ad9-45e7-4c79-ba30-ef4b423809b0" (UID: "df950ad9-45e7-4c79-ba30-ef4b423809b0"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.204398 4691 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.204450 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.204461 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.204469 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.204478 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/df950ad9-45e7-4c79-ba30-ef4b423809b0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.204486 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9h4x\" (UniqueName: \"kubernetes.io/projected/df950ad9-45e7-4c79-ba30-ef4b423809b0-kube-api-access-g9h4x\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.231493 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage01-crc") on node "crc" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.306872 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.324742 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.622930 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"df950ad9-45e7-4c79-ba30-ef4b423809b0","Type":"ContainerDied","Data":"f2e64b185b731a5cfda931e79a68c8c1f00dea6e941defca043a69f79c3b9b71"} Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.623977 4691 scope.go:117] "RemoveContainer" containerID="80b311ef92245ba29ced3c5c44eb4f6e0a4d8ae29cd79b446f38c4dda55ef30f" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.622997 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.659707 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.662862 4691 scope.go:117] "RemoveContainer" containerID="6e10c72881f1db003dcd37cc7e2bc8d9653b2faad08dd92dbbc418620d9785a7" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.690870 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.698643 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:37:40 crc kubenswrapper[4691]: E0930 06:37:40.699505 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="149dd243-4659-4434-a6d1-63fb57617546" containerName="mariadb-database-create" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.699593 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="149dd243-4659-4434-a6d1-63fb57617546" containerName="mariadb-database-create" Sep 30 06:37:40 crc kubenswrapper[4691]: E0930 06:37:40.699662 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df950ad9-45e7-4c79-ba30-ef4b423809b0" containerName="glance-log" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.699720 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="df950ad9-45e7-4c79-ba30-ef4b423809b0" containerName="glance-log" Sep 30 06:37:40 crc kubenswrapper[4691]: E0930 06:37:40.699785 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df950ad9-45e7-4c79-ba30-ef4b423809b0" containerName="glance-httpd" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.699835 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="df950ad9-45e7-4c79-ba30-ef4b423809b0" containerName="glance-httpd" Sep 30 06:37:40 crc kubenswrapper[4691]: E0930 06:37:40.699904 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63e509fa-065b-49fa-8e8b-292350d86b8f" containerName="mariadb-database-create" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.699959 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="63e509fa-065b-49fa-8e8b-292350d86b8f" containerName="mariadb-database-create" Sep 30 06:37:40 crc kubenswrapper[4691]: E0930 06:37:40.700045 4691 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="9a199530-065b-4a27-bd83-b0b8f0ae2c13" containerName="mariadb-database-create" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.700097 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a199530-065b-4a27-bd83-b0b8f0ae2c13" containerName="mariadb-database-create" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.700318 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="63e509fa-065b-49fa-8e8b-292350d86b8f" containerName="mariadb-database-create" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.700384 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="df950ad9-45e7-4c79-ba30-ef4b423809b0" containerName="glance-log" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.700445 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a199530-065b-4a27-bd83-b0b8f0ae2c13" containerName="mariadb-database-create" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.700506 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="149dd243-4659-4434-a6d1-63fb57617546" containerName="mariadb-database-create" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.700577 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="df950ad9-45e7-4c79-ba30-ef4b423809b0" containerName="glance-httpd" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.701705 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.714173 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.748696 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.749531 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.818212 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.818276 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.818311 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.818345 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.818365 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgpf2\" (UniqueName: \"kubernetes.io/projected/34863af3-4c23-43ce-b483-713ca0d1f744-kube-api-access-fgpf2\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.818387 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.818447 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34863af3-4c23-43ce-b483-713ca0d1f744-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.818490 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34863af3-4c23-43ce-b483-713ca0d1f744-logs\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.920257 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.920314 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.920339 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgpf2\" (UniqueName: \"kubernetes.io/projected/34863af3-4c23-43ce-b483-713ca0d1f744-kube-api-access-fgpf2\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.920361 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.920391 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34863af3-4c23-43ce-b483-713ca0d1f744-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.920428 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34863af3-4c23-43ce-b483-713ca0d1f744-logs\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.920488 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.920536 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.920872 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.921024 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34863af3-4c23-43ce-b483-713ca0d1f744-logs\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.921416 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34863af3-4c23-43ce-b483-713ca0d1f744-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.925242 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.925425 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.926137 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 
06:37:40.927644 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34863af3-4c23-43ce-b483-713ca0d1f744-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.936806 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgpf2\" (UniqueName: \"kubernetes.io/projected/34863af3-4c23-43ce-b483-713ca0d1f744-kube-api-access-fgpf2\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.956399 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"34863af3-4c23-43ce-b483-713ca0d1f744\") " pod="openstack/glance-default-internal-api-0" Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.975578 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.975801 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="ceilometer-central-agent" containerID="cri-o://3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42" gracePeriod=30 Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.975935 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="proxy-httpd" containerID="cri-o://0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee" gracePeriod=30 Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.976038 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="sg-core" containerID="cri-o://eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d" gracePeriod=30 Sep 30 06:37:40 crc kubenswrapper[4691]: I0930 06:37:40.976048 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="ceilometer-notification-agent" containerID="cri-o://0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382" gracePeriod=30 Sep 30 06:37:41 crc kubenswrapper[4691]: I0930 06:37:41.066794 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:41 crc kubenswrapper[4691]: I0930 06:37:41.244082 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df950ad9-45e7-4c79-ba30-ef4b423809b0" path="/var/lib/kubelet/pods/df950ad9-45e7-4c79-ba30-ef4b423809b0/volumes" Sep 30 06:37:41 crc kubenswrapper[4691]: W0930 06:37:41.589974 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34863af3_4c23_43ce_b483_713ca0d1f744.slice/crio-c9878984e4fae14b979c344037f6524bd167a6957fd0625b8e4a050a1c298ce7 WatchSource:0}: Error finding container c9878984e4fae14b979c344037f6524bd167a6957fd0625b8e4a050a1c298ce7: Status 404 returned error can't find the container with id c9878984e4fae14b979c344037f6524bd167a6957fd0625b8e4a050a1c298ce7 Sep 30 06:37:41 crc kubenswrapper[4691]: I0930 06:37:41.590684 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Sep 30 06:37:41 crc kubenswrapper[4691]: I0930 06:37:41.645871 4691 generic.go:334] "Generic (PLEG): container finished" podID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerID="0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee" exitCode=0 Sep 30 06:37:41 crc kubenswrapper[4691]: I0930 06:37:41.645917 4691 generic.go:334] "Generic (PLEG): container finished" podID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerID="eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d" exitCode=2 Sep 30 06:37:41 crc kubenswrapper[4691]: I0930 06:37:41.645924 4691 generic.go:334] "Generic (PLEG): container finished" podID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerID="0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382" exitCode=0 Sep 30 06:37:41 crc kubenswrapper[4691]: I0930 06:37:41.645922 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a","Type":"ContainerDied","Data":"0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee"} Sep 30 06:37:41 crc kubenswrapper[4691]: I0930 06:37:41.645991 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a","Type":"ContainerDied","Data":"eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d"} Sep 30 06:37:41 crc kubenswrapper[4691]: I0930 06:37:41.646004 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a","Type":"ContainerDied","Data":"0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382"} Sep 30 06:37:41 crc kubenswrapper[4691]: I0930 06:37:41.648447 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34863af3-4c23-43ce-b483-713ca0d1f744","Type":"ContainerStarted","Data":"c9878984e4fae14b979c344037f6524bd167a6957fd0625b8e4a050a1c298ce7"} Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.361369 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.446295 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-run-httpd\") pod \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.446346 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-log-httpd\") pod \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.446406 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-combined-ca-bundle\") pod \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.446425 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-sg-core-conf-yaml\") pod \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.446468 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ttcq\" (UniqueName: \"kubernetes.io/projected/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-kube-api-access-7ttcq\") pod \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.446513 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-scripts\") pod \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.446547 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-config-data\") pod \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\" (UID: \"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a\") " Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.450756 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" (UID: "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.450817 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" (UID: "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.453196 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-kube-api-access-7ttcq" (OuterVolumeSpecName: "kube-api-access-7ttcq") pod "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" (UID: "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a"). InnerVolumeSpecName "kube-api-access-7ttcq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.453713 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-scripts" (OuterVolumeSpecName: "scripts") pod "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" (UID: "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.482354 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" (UID: "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.539489 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" (UID: "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.548693 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.548723 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.548734 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.548742 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.548752 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.548760 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ttcq\" (UniqueName: \"kubernetes.io/projected/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-kube-api-access-7ttcq\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.553476 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-config-data" (OuterVolumeSpecName: "config-data") pod "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" (UID: "a6e2ebb6-b355-4f57-bdd0-e3c2f004693a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.650212 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.664224 4691 generic.go:334] "Generic (PLEG): container finished" podID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerID="3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42" exitCode=0 Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.664288 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a","Type":"ContainerDied","Data":"3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42"} Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.664318 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a6e2ebb6-b355-4f57-bdd0-e3c2f004693a","Type":"ContainerDied","Data":"d58be6dd2cf9a2bd829283c9ac2aa04fb8925c7e65cc7024f06810081223fe55"} Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.664339 4691 scope.go:117] "RemoveContainer" containerID="0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.664438 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.667900 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34863af3-4c23-43ce-b483-713ca0d1f744","Type":"ContainerStarted","Data":"54367a404a57acfce7e73a7ddb5dc2dfbd8fe2e1bc42360e57eed98e862911c8"} Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.698466 4691 scope.go:117] "RemoveContainer" containerID="eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.715812 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.734398 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.742582 4691 scope.go:117] "RemoveContainer" containerID="0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.746433 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:42 crc kubenswrapper[4691]: E0930 06:37:42.746791 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="ceilometer-notification-agent" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.746806 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="ceilometer-notification-agent" Sep 30 06:37:42 crc kubenswrapper[4691]: E0930 06:37:42.746826 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="proxy-httpd" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 
06:37:42.746833 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="proxy-httpd" Sep 30 06:37:42 crc kubenswrapper[4691]: E0930 06:37:42.746849 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="ceilometer-central-agent" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.746856 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="ceilometer-central-agent" Sep 30 06:37:42 crc kubenswrapper[4691]: E0930 06:37:42.746877 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="sg-core" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.746898 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="sg-core" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.747106 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="proxy-httpd" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.747124 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="ceilometer-notification-agent" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.747147 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="ceilometer-central-agent" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.747163 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" containerName="sg-core" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.750039 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.754719 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.754915 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.760698 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.777121 4691 scope.go:117] "RemoveContainer" containerID="3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.818173 4691 scope.go:117] "RemoveContainer" containerID="0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee" Sep 30 06:37:42 crc kubenswrapper[4691]: E0930 06:37:42.818712 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee\": container with ID starting with 0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee not found: ID does not exist" containerID="0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.818743 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee"} err="failed to get container status \"0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee\": rpc error: code = NotFound desc = could not find container \"0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee\": container with ID starting with 0adcddc8119b25acec937c92689758224e3a51a61572bb4e32943507f9c3eaee not found: ID does not exist" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.818763 4691 scope.go:117] "RemoveContainer" containerID="eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d" Sep 30 06:37:42 crc kubenswrapper[4691]: E0930 06:37:42.819072 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d\": container with ID starting with eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d not found: ID does not exist" containerID="eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.819128 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d"} err="failed to get container status \"eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d\": rpc error: code = NotFound desc = could not find container \"eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d\": container with ID starting with eeff54f6517adb79cbb94666d22116dd91610eae852a6ac745d091cf6489839d not found: ID does not exist" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.819156 4691 scope.go:117] "RemoveContainer" containerID="0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382" Sep 30 06:37:42 crc kubenswrapper[4691]: E0930 06:37:42.819500 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382\": container with ID starting with 0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382 not found: ID does not exist" containerID="0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.819538 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382"} err="failed to get container status \"0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382\": rpc error: code = NotFound desc = could not find container \"0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382\": container with ID starting with 0713b85797703b13631fa914b183c5b54e11897c5a706ccf5cd4c98c5c772382 not found: ID does not exist" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.819553 4691 scope.go:117] "RemoveContainer" containerID="3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42" Sep 30 06:37:42 crc kubenswrapper[4691]: E0930 06:37:42.819805 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42\": container with ID starting with 3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42 not found: ID does not exist" containerID="3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.819824 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42"} err="failed to get container status \"3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42\": rpc error: code = NotFound desc = could not find container \"3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42\": container with ID starting with 3e6baa15cf5976146083765bcf26dd94cdc365d2a99fc1b5ad03b07dd7a66d42 not found: ID does not exist" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.852727 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.852789 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mjz4\" (UniqueName: \"kubernetes.io/projected/fc3c5478-606b-4088-aec6-a5652fc3ffb1-kube-api-access-7mjz4\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.852860 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.852971 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-scripts\") pod \"ceilometer-0\" (UID: 
\"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.853007 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-config-data\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.853066 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-log-httpd\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.853090 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-run-httpd\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.954632 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-scripts\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.954684 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-config-data\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.954759 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-log-httpd\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.954791 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-run-httpd\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.954824 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.954861 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mjz4\" (UniqueName: \"kubernetes.io/projected/fc3c5478-606b-4088-aec6-a5652fc3ffb1-kube-api-access-7mjz4\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.954987 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-combined-ca-bundle\") pod 
\"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.955318 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-run-httpd\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.955394 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-log-httpd\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.959388 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-config-data\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.959479 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.959508 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.961063 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-scripts\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:42 crc kubenswrapper[4691]: I0930 06:37:42.970644 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mjz4\" (UniqueName: \"kubernetes.io/projected/fc3c5478-606b-4088-aec6-a5652fc3ffb1-kube-api-access-7mjz4\") pod \"ceilometer-0\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " pod="openstack/ceilometer-0" Sep 30 06:37:43 crc kubenswrapper[4691]: I0930 06:37:43.066850 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:43 crc kubenswrapper[4691]: I0930 06:37:43.205784 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:43 crc kubenswrapper[4691]: I0930 06:37:43.243708 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6e2ebb6-b355-4f57-bdd0-e3c2f004693a" path="/var/lib/kubelet/pods/a6e2ebb6-b355-4f57-bdd0-e3c2f004693a/volumes" Sep 30 06:37:43 crc kubenswrapper[4691]: I0930 06:37:43.512770 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:43 crc kubenswrapper[4691]: I0930 06:37:43.688962 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3c5478-606b-4088-aec6-a5652fc3ffb1","Type":"ContainerStarted","Data":"5a7a5bcff3758c5f9d9da70a55d698c27487ff6e99c2729cf15ef82a76a16249"} Sep 30 06:37:43 crc kubenswrapper[4691]: I0930 06:37:43.692406 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34863af3-4c23-43ce-b483-713ca0d1f744","Type":"ContainerStarted","Data":"440a5a9617d667637aa5a4e53fc42419594cd8f908e8b3b7d3c5799e1d975cf6"} Sep 30 06:37:43 crc kubenswrapper[4691]: I0930 06:37:43.716796 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.716778121 podStartE2EDuration="3.716778121s" podCreationTimestamp="2025-09-30 06:37:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:43.711372792 +0000 UTC m=+1107.186393842" watchObservedRunningTime="2025-09-30 06:37:43.716778121 +0000 UTC m=+1107.191799171" Sep 30 06:37:44 crc kubenswrapper[4691]: I0930 06:37:44.708836 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3c5478-606b-4088-aec6-a5652fc3ffb1","Type":"ContainerStarted","Data":"e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12"} Sep 30 06:37:44 crc kubenswrapper[4691]: I0930 06:37:44.709179 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3c5478-606b-4088-aec6-a5652fc3ffb1","Type":"ContainerStarted","Data":"4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b"} Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.198294 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-2f4d-account-create-lb6cd"] Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.199895 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-2f4d-account-create-lb6cd" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.206468 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.215983 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2f4d-account-create-lb6cd"] Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.301281 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqvkh\" (UniqueName: \"kubernetes.io/projected/1b656027-0cbe-4ac6-bc45-b6b91d5246c9-kube-api-access-lqvkh\") pod \"nova-api-2f4d-account-create-lb6cd\" (UID: \"1b656027-0cbe-4ac6-bc45-b6b91d5246c9\") " pod="openstack/nova-api-2f4d-account-create-lb6cd" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.403073 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqvkh\" (UniqueName: \"kubernetes.io/projected/1b656027-0cbe-4ac6-bc45-b6b91d5246c9-kube-api-access-lqvkh\") pod \"nova-api-2f4d-account-create-lb6cd\" (UID: \"1b656027-0cbe-4ac6-bc45-b6b91d5246c9\") " pod="openstack/nova-api-2f4d-account-create-lb6cd" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.405504 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7d09-account-create-9q4qn"] Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.406782 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7d09-account-create-9q4qn" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.409848 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.435393 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqvkh\" (UniqueName: \"kubernetes.io/projected/1b656027-0cbe-4ac6-bc45-b6b91d5246c9-kube-api-access-lqvkh\") pod \"nova-api-2f4d-account-create-lb6cd\" (UID: \"1b656027-0cbe-4ac6-bc45-b6b91d5246c9\") " pod="openstack/nova-api-2f4d-account-create-lb6cd" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.456978 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7d09-account-create-9q4qn"] Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.504961 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmn4d\" (UniqueName: \"kubernetes.io/projected/5ee2df6e-d485-4863-ad00-4a6c50d7f726-kube-api-access-jmn4d\") pod \"nova-cell0-7d09-account-create-9q4qn\" (UID: \"5ee2df6e-d485-4863-ad00-4a6c50d7f726\") " pod="openstack/nova-cell0-7d09-account-create-9q4qn" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.531377 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2f4d-account-create-lb6cd" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.601929 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-7d40-account-create-2b8c5"] Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.603236 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7d40-account-create-2b8c5" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.612996 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7d40-account-create-2b8c5"] Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.613546 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.614280 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmn4d\" (UniqueName: \"kubernetes.io/projected/5ee2df6e-d485-4863-ad00-4a6c50d7f726-kube-api-access-jmn4d\") pod \"nova-cell0-7d09-account-create-9q4qn\" (UID: \"5ee2df6e-d485-4863-ad00-4a6c50d7f726\") " pod="openstack/nova-cell0-7d09-account-create-9q4qn" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.638999 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmn4d\" (UniqueName: \"kubernetes.io/projected/5ee2df6e-d485-4863-ad00-4a6c50d7f726-kube-api-access-jmn4d\") pod \"nova-cell0-7d09-account-create-9q4qn\" (UID: \"5ee2df6e-d485-4863-ad00-4a6c50d7f726\") " pod="openstack/nova-cell0-7d09-account-create-9q4qn" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.716031 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chhzf\" (UniqueName: \"kubernetes.io/projected/14ced404-77df-4f44-aece-f7e3d8add6d2-kube-api-access-chhzf\") pod \"nova-cell1-7d40-account-create-2b8c5\" (UID: \"14ced404-77df-4f44-aece-f7e3d8add6d2\") " pod="openstack/nova-cell1-7d40-account-create-2b8c5" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.727356 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3c5478-606b-4088-aec6-a5652fc3ffb1","Type":"ContainerStarted","Data":"d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800"} Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.731197 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7d09-account-create-9q4qn" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.818268 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chhzf\" (UniqueName: \"kubernetes.io/projected/14ced404-77df-4f44-aece-f7e3d8add6d2-kube-api-access-chhzf\") pod \"nova-cell1-7d40-account-create-2b8c5\" (UID: \"14ced404-77df-4f44-aece-f7e3d8add6d2\") " pod="openstack/nova-cell1-7d40-account-create-2b8c5" Sep 30 06:37:45 crc kubenswrapper[4691]: I0930 06:37:45.833790 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chhzf\" (UniqueName: \"kubernetes.io/projected/14ced404-77df-4f44-aece-f7e3d8add6d2-kube-api-access-chhzf\") pod \"nova-cell1-7d40-account-create-2b8c5\" (UID: \"14ced404-77df-4f44-aece-f7e3d8add6d2\") " pod="openstack/nova-cell1-7d40-account-create-2b8c5" Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.009460 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-2f4d-account-create-lb6cd"] Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.039613 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7d40-account-create-2b8c5" Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.160798 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7d09-account-create-9q4qn"] Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.471107 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-7d40-account-create-2b8c5"] Sep 30 06:37:46 crc kubenswrapper[4691]: W0930 06:37:46.487873 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod14ced404_77df_4f44_aece_f7e3d8add6d2.slice/crio-52e7c2f153d408f8c2370e0c31ac7e34640ae3e259ffd72792442a5610baf032 WatchSource:0}: Error finding container 52e7c2f153d408f8c2370e0c31ac7e34640ae3e259ffd72792442a5610baf032: Status 404 returned error can't find the container with id 52e7c2f153d408f8c2370e0c31ac7e34640ae3e259ffd72792442a5610baf032 Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.739262 4691 generic.go:334] "Generic (PLEG): container finished" podID="5ee2df6e-d485-4863-ad00-4a6c50d7f726" containerID="4fac8c1261ee213426f764c37fe193efedcaa5ff9e16a5655a1a9aa7ecdb85d3" exitCode=0 Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.739561 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7d09-account-create-9q4qn" event={"ID":"5ee2df6e-d485-4863-ad00-4a6c50d7f726","Type":"ContainerDied","Data":"4fac8c1261ee213426f764c37fe193efedcaa5ff9e16a5655a1a9aa7ecdb85d3"} Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.739607 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7d09-account-create-9q4qn" event={"ID":"5ee2df6e-d485-4863-ad00-4a6c50d7f726","Type":"ContainerStarted","Data":"2229901b25ab6ca231e36611f254979e29fcd4e6e239dcf03c626f51db4b763e"} Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.742993 4691 generic.go:334] "Generic (PLEG): container finished" podID="1b656027-0cbe-4ac6-bc45-b6b91d5246c9" containerID="b41bf93c569126220b66ae79b81c8d5f338fe1362cfdb688b6c6a84396578d36" exitCode=0 Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.743040 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2f4d-account-create-lb6cd" event={"ID":"1b656027-0cbe-4ac6-bc45-b6b91d5246c9","Type":"ContainerDied","Data":"b41bf93c569126220b66ae79b81c8d5f338fe1362cfdb688b6c6a84396578d36"} Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.743089 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2f4d-account-create-lb6cd" event={"ID":"1b656027-0cbe-4ac6-bc45-b6b91d5246c9","Type":"ContainerStarted","Data":"eb008f383fca0b964a4d9c8e67b71cc64154cc0773565c524bc425b2a8cb6d1e"} Sep 30 06:37:46 crc kubenswrapper[4691]: I0930 06:37:46.745375 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7d40-account-create-2b8c5" event={"ID":"14ced404-77df-4f44-aece-f7e3d8add6d2","Type":"ContainerStarted","Data":"52e7c2f153d408f8c2370e0c31ac7e34640ae3e259ffd72792442a5610baf032"} Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.475840 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.476138 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bd274ff4-663a-4621-af28-d5fca3e5b139" containerName="glance-log" 
containerID="cri-o://5a2b6edbc643a5929286b269316a17d2033f2b4ddab512e09a4bf6912eb16c15" gracePeriod=30 Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.476193 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="bd274ff4-663a-4621-af28-d5fca3e5b139" containerName="glance-httpd" containerID="cri-o://b84ccdedf764479d11a712b4b4ddd548c5760af2524c9c926d968905051ec7a8" gracePeriod=30 Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.756108 4691 generic.go:334] "Generic (PLEG): container finished" podID="bd274ff4-663a-4621-af28-d5fca3e5b139" containerID="5a2b6edbc643a5929286b269316a17d2033f2b4ddab512e09a4bf6912eb16c15" exitCode=143 Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.756196 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd274ff4-663a-4621-af28-d5fca3e5b139","Type":"ContainerDied","Data":"5a2b6edbc643a5929286b269316a17d2033f2b4ddab512e09a4bf6912eb16c15"} Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.758625 4691 generic.go:334] "Generic (PLEG): container finished" podID="14ced404-77df-4f44-aece-f7e3d8add6d2" containerID="f02be3efb729e7e0e0f978ba4fa4f1b56019a452276a1452c54353dfaf7957f6" exitCode=0 Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.758690 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7d40-account-create-2b8c5" event={"ID":"14ced404-77df-4f44-aece-f7e3d8add6d2","Type":"ContainerDied","Data":"f02be3efb729e7e0e0f978ba4fa4f1b56019a452276a1452c54353dfaf7957f6"} Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.762067 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3c5478-606b-4088-aec6-a5652fc3ffb1","Type":"ContainerStarted","Data":"e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f"} Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.762378 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="proxy-httpd" containerID="cri-o://e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f" gracePeriod=30 Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.762390 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="ceilometer-central-agent" containerID="cri-o://4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b" gracePeriod=30 Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.762423 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="ceilometer-notification-agent" containerID="cri-o://e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12" gracePeriod=30 Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.762387 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="sg-core" containerID="cri-o://d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800" gracePeriod=30 Sep 30 06:37:47 crc kubenswrapper[4691]: I0930 06:37:47.796603 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.726082263 podStartE2EDuration="5.79658424s" 
podCreationTimestamp="2025-09-30 06:37:42 +0000 UTC" firstStartedPulling="2025-09-30 06:37:43.524711134 +0000 UTC m=+1106.999732204" lastFinishedPulling="2025-09-30 06:37:46.595213141 +0000 UTC m=+1110.070234181" observedRunningTime="2025-09-30 06:37:47.791360548 +0000 UTC m=+1111.266381598" watchObservedRunningTime="2025-09-30 06:37:47.79658424 +0000 UTC m=+1111.271605290" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.143816 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2f4d-account-create-lb6cd" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.278422 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqvkh\" (UniqueName: \"kubernetes.io/projected/1b656027-0cbe-4ac6-bc45-b6b91d5246c9-kube-api-access-lqvkh\") pod \"1b656027-0cbe-4ac6-bc45-b6b91d5246c9\" (UID: \"1b656027-0cbe-4ac6-bc45-b6b91d5246c9\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.286388 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b656027-0cbe-4ac6-bc45-b6b91d5246c9-kube-api-access-lqvkh" (OuterVolumeSpecName: "kube-api-access-lqvkh") pod "1b656027-0cbe-4ac6-bc45-b6b91d5246c9" (UID: "1b656027-0cbe-4ac6-bc45-b6b91d5246c9"). InnerVolumeSpecName "kube-api-access-lqvkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.323451 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7d09-account-create-9q4qn" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.382421 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmn4d\" (UniqueName: \"kubernetes.io/projected/5ee2df6e-d485-4863-ad00-4a6c50d7f726-kube-api-access-jmn4d\") pod \"5ee2df6e-d485-4863-ad00-4a6c50d7f726\" (UID: \"5ee2df6e-d485-4863-ad00-4a6c50d7f726\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.382971 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqvkh\" (UniqueName: \"kubernetes.io/projected/1b656027-0cbe-4ac6-bc45-b6b91d5246c9-kube-api-access-lqvkh\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.386330 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ee2df6e-d485-4863-ad00-4a6c50d7f726-kube-api-access-jmn4d" (OuterVolumeSpecName: "kube-api-access-jmn4d") pod "5ee2df6e-d485-4863-ad00-4a6c50d7f726" (UID: "5ee2df6e-d485-4863-ad00-4a6c50d7f726"). InnerVolumeSpecName "kube-api-access-jmn4d". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.484788 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmn4d\" (UniqueName: \"kubernetes.io/projected/5ee2df6e-d485-4863-ad00-4a6c50d7f726-kube-api-access-jmn4d\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.781846 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.783646 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-2f4d-account-create-lb6cd" event={"ID":"1b656027-0cbe-4ac6-bc45-b6b91d5246c9","Type":"ContainerDied","Data":"eb008f383fca0b964a4d9c8e67b71cc64154cc0773565c524bc425b2a8cb6d1e"} Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.783689 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb008f383fca0b964a4d9c8e67b71cc64154cc0773565c524bc425b2a8cb6d1e" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.783702 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-2f4d-account-create-lb6cd" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.785819 4691 generic.go:334] "Generic (PLEG): container finished" podID="bd274ff4-663a-4621-af28-d5fca3e5b139" containerID="b84ccdedf764479d11a712b4b4ddd548c5760af2524c9c926d968905051ec7a8" exitCode=0 Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.785894 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd274ff4-663a-4621-af28-d5fca3e5b139","Type":"ContainerDied","Data":"b84ccdedf764479d11a712b4b4ddd548c5760af2524c9c926d968905051ec7a8"} Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.792007 4691 generic.go:334] "Generic (PLEG): container finished" podID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerID="e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f" exitCode=0 Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.792032 4691 generic.go:334] "Generic (PLEG): container finished" podID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerID="d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800" exitCode=2 Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.792041 4691 generic.go:334] "Generic (PLEG): container finished" podID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerID="e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12" exitCode=0 Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.792086 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3c5478-606b-4088-aec6-a5652fc3ffb1","Type":"ContainerDied","Data":"e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f"} Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.792126 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3c5478-606b-4088-aec6-a5652fc3ffb1","Type":"ContainerDied","Data":"d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800"} Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.792136 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3c5478-606b-4088-aec6-a5652fc3ffb1","Type":"ContainerDied","Data":"e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12"} Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.793638 4691 generic.go:334] "Generic (PLEG): container finished" podID="faa07cbe-8687-4be2-8757-8e730fccb6bb" containerID="4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587" exitCode=137 Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.793677 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.793668 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"faa07cbe-8687-4be2-8757-8e730fccb6bb","Type":"ContainerDied","Data":"4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587"} Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.793793 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"faa07cbe-8687-4be2-8757-8e730fccb6bb","Type":"ContainerDied","Data":"aa55314589fad3b35775c0a75df9ffc5ddae38488d7f3783e12f6abd00b93fbd"} Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.793813 4691 scope.go:117] "RemoveContainer" containerID="4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.795272 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7d09-account-create-9q4qn" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.797002 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7d09-account-create-9q4qn" event={"ID":"5ee2df6e-d485-4863-ad00-4a6c50d7f726","Type":"ContainerDied","Data":"2229901b25ab6ca231e36611f254979e29fcd4e6e239dcf03c626f51db4b763e"} Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.797037 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2229901b25ab6ca231e36611f254979e29fcd4e6e239dcf03c626f51db4b763e" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.866750 4691 scope.go:117] "RemoveContainer" containerID="6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.891740 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-scripts\") pod \"faa07cbe-8687-4be2-8757-8e730fccb6bb\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.891853 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data\") pod \"faa07cbe-8687-4be2-8757-8e730fccb6bb\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.891928 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data-custom\") pod \"faa07cbe-8687-4be2-8757-8e730fccb6bb\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.891955 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa07cbe-8687-4be2-8757-8e730fccb6bb-logs\") pod \"faa07cbe-8687-4be2-8757-8e730fccb6bb\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.894202 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-combined-ca-bundle\") pod \"faa07cbe-8687-4be2-8757-8e730fccb6bb\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.894281 4691 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/faa07cbe-8687-4be2-8757-8e730fccb6bb-etc-machine-id\") pod \"faa07cbe-8687-4be2-8757-8e730fccb6bb\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.894382 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps4l8\" (UniqueName: \"kubernetes.io/projected/faa07cbe-8687-4be2-8757-8e730fccb6bb-kube-api-access-ps4l8\") pod \"faa07cbe-8687-4be2-8757-8e730fccb6bb\" (UID: \"faa07cbe-8687-4be2-8757-8e730fccb6bb\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.897031 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-scripts" (OuterVolumeSpecName: "scripts") pod "faa07cbe-8687-4be2-8757-8e730fccb6bb" (UID: "faa07cbe-8687-4be2-8757-8e730fccb6bb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.897279 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.897573 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "faa07cbe-8687-4be2-8757-8e730fccb6bb" (UID: "faa07cbe-8687-4be2-8757-8e730fccb6bb"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.897940 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa07cbe-8687-4be2-8757-8e730fccb6bb-logs" (OuterVolumeSpecName: "logs") pod "faa07cbe-8687-4be2-8757-8e730fccb6bb" (UID: "faa07cbe-8687-4be2-8757-8e730fccb6bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.898042 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/faa07cbe-8687-4be2-8757-8e730fccb6bb-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "faa07cbe-8687-4be2-8757-8e730fccb6bb" (UID: "faa07cbe-8687-4be2-8757-8e730fccb6bb"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.899276 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa07cbe-8687-4be2-8757-8e730fccb6bb-kube-api-access-ps4l8" (OuterVolumeSpecName: "kube-api-access-ps4l8") pod "faa07cbe-8687-4be2-8757-8e730fccb6bb" (UID: "faa07cbe-8687-4be2-8757-8e730fccb6bb"). InnerVolumeSpecName "kube-api-access-ps4l8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.925629 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "faa07cbe-8687-4be2-8757-8e730fccb6bb" (UID: "faa07cbe-8687-4be2-8757-8e730fccb6bb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.951078 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.953606 4691 scope.go:117] "RemoveContainer" containerID="4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587" Sep 30 06:37:48 crc kubenswrapper[4691]: E0930 06:37:48.954079 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587\": container with ID starting with 4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587 not found: ID does not exist" containerID="4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.954208 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587"} err="failed to get container status \"4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587\": rpc error: code = NotFound desc = could not find container \"4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587\": container with ID starting with 4b5150a2721319b421cd3bc5176d9411074b6febc4ead3606b3a6efedbea3587 not found: ID does not exist" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.954293 4691 scope.go:117] "RemoveContainer" containerID="6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0" Sep 30 06:37:48 crc kubenswrapper[4691]: E0930 06:37:48.957174 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0\": container with ID starting with 6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0 not found: ID does not exist" containerID="6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.957225 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0"} err="failed to get container status \"6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0\": rpc error: code = NotFound desc = could not find container \"6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0\": container with ID starting with 6afdf23a85746e8816a73dc3776966f7353659e9415744704f9fc5c1f961ffc0 not found: ID does not exist" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.976996 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data" (OuterVolumeSpecName: "config-data") pod "faa07cbe-8687-4be2-8757-8e730fccb6bb" (UID: "faa07cbe-8687-4be2-8757-8e730fccb6bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.998821 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"bd274ff4-663a-4621-af28-d5fca3e5b139\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.998956 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-config-data\") pod \"bd274ff4-663a-4621-af28-d5fca3e5b139\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999014 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-httpd-run\") pod \"bd274ff4-663a-4621-af28-d5fca3e5b139\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999039 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-logs\") pod \"bd274ff4-663a-4621-af28-d5fca3e5b139\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999093 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-scripts\") pod \"bd274ff4-663a-4621-af28-d5fca3e5b139\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999186 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-public-tls-certs\") pod \"bd274ff4-663a-4621-af28-d5fca3e5b139\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999222 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjdqv\" (UniqueName: \"kubernetes.io/projected/bd274ff4-663a-4621-af28-d5fca3e5b139-kube-api-access-bjdqv\") pod \"bd274ff4-663a-4621-af28-d5fca3e5b139\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999250 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-combined-ca-bundle\") pod \"bd274ff4-663a-4621-af28-d5fca3e5b139\" (UID: \"bd274ff4-663a-4621-af28-d5fca3e5b139\") " Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999427 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bd274ff4-663a-4621-af28-d5fca3e5b139" (UID: "bd274ff4-663a-4621-af28-d5fca3e5b139"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999708 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999734 4691 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-config-data-custom\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999748 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/faa07cbe-8687-4be2-8757-8e730fccb6bb-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999762 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/faa07cbe-8687-4be2-8757-8e730fccb6bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999771 4691 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/faa07cbe-8687-4be2-8757-8e730fccb6bb-etc-machine-id\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999781 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps4l8\" (UniqueName: \"kubernetes.io/projected/faa07cbe-8687-4be2-8757-8e730fccb6bb-kube-api-access-ps4l8\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:48 crc kubenswrapper[4691]: I0930 06:37:48.999790 4691 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-httpd-run\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.000513 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-logs" (OuterVolumeSpecName: "logs") pod "bd274ff4-663a-4621-af28-d5fca3e5b139" (UID: "bd274ff4-663a-4621-af28-d5fca3e5b139"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.002793 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "bd274ff4-663a-4621-af28-d5fca3e5b139" (UID: "bd274ff4-663a-4621-af28-d5fca3e5b139"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.004191 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-scripts" (OuterVolumeSpecName: "scripts") pod "bd274ff4-663a-4621-af28-d5fca3e5b139" (UID: "bd274ff4-663a-4621-af28-d5fca3e5b139"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.006875 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd274ff4-663a-4621-af28-d5fca3e5b139-kube-api-access-bjdqv" (OuterVolumeSpecName: "kube-api-access-bjdqv") pod "bd274ff4-663a-4621-af28-d5fca3e5b139" (UID: "bd274ff4-663a-4621-af28-d5fca3e5b139"). 
InnerVolumeSpecName "kube-api-access-bjdqv". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.047977 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd274ff4-663a-4621-af28-d5fca3e5b139" (UID: "bd274ff4-663a-4621-af28-d5fca3e5b139"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.113013 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.113045 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd274ff4-663a-4621-af28-d5fca3e5b139-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.113054 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.113062 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjdqv\" (UniqueName: \"kubernetes.io/projected/bd274ff4-663a-4621-af28-d5fca3e5b139-kube-api-access-bjdqv\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.113073 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.114066 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "bd274ff4-663a-4621-af28-d5fca3e5b139" (UID: "bd274ff4-663a-4621-af28-d5fca3e5b139"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.177060 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-config-data" (OuterVolumeSpecName: "config-data") pod "bd274ff4-663a-4621-af28-d5fca3e5b139" (UID: "bd274ff4-663a-4621-af28-d5fca3e5b139"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.185711 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.189990 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.205208 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.212188 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Sep 30 06:37:49 crc kubenswrapper[4691]: E0930 06:37:49.212669 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa07cbe-8687-4be2-8757-8e730fccb6bb" containerName="cinder-api" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.212761 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa07cbe-8687-4be2-8757-8e730fccb6bb" containerName="cinder-api" Sep 30 06:37:49 crc kubenswrapper[4691]: E0930 06:37:49.212831 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ee2df6e-d485-4863-ad00-4a6c50d7f726" containerName="mariadb-account-create" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.212896 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ee2df6e-d485-4863-ad00-4a6c50d7f726" containerName="mariadb-account-create" Sep 30 06:37:49 crc kubenswrapper[4691]: E0930 06:37:49.212951 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd274ff4-663a-4621-af28-d5fca3e5b139" containerName="glance-log" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.213022 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd274ff4-663a-4621-af28-d5fca3e5b139" containerName="glance-log" Sep 30 06:37:49 crc kubenswrapper[4691]: E0930 06:37:49.213093 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd274ff4-663a-4621-af28-d5fca3e5b139" containerName="glance-httpd" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.213148 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd274ff4-663a-4621-af28-d5fca3e5b139" containerName="glance-httpd" Sep 30 06:37:49 crc kubenswrapper[4691]: E0930 06:37:49.213205 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa07cbe-8687-4be2-8757-8e730fccb6bb" containerName="cinder-api-log" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.213253 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa07cbe-8687-4be2-8757-8e730fccb6bb" containerName="cinder-api-log" Sep 30 06:37:49 crc kubenswrapper[4691]: E0930 06:37:49.213312 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b656027-0cbe-4ac6-bc45-b6b91d5246c9" containerName="mariadb-account-create" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.213364 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b656027-0cbe-4ac6-bc45-b6b91d5246c9" containerName="mariadb-account-create" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.213598 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ee2df6e-d485-4863-ad00-4a6c50d7f726" containerName="mariadb-account-create" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.213671 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd274ff4-663a-4621-af28-d5fca3e5b139" containerName="glance-httpd" Sep 30 06:37:49 crc 
kubenswrapper[4691]: I0930 06:37:49.213733 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa07cbe-8687-4be2-8757-8e730fccb6bb" containerName="cinder-api" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.213790 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b656027-0cbe-4ac6-bc45-b6b91d5246c9" containerName="mariadb-account-create" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.213846 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd274ff4-663a-4621-af28-d5fca3e5b139" containerName="glance-log" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.213917 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa07cbe-8687-4be2-8757-8e730fccb6bb" containerName="cinder-api-log" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.215728 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.215755 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.215766 4691 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd274ff4-663a-4621-af28-d5fca3e5b139-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.219153 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.224258 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.224461 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.248631 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.251380 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa07cbe-8687-4be2-8757-8e730fccb6bb" path="/var/lib/kubelet/pods/faa07cbe-8687-4be2-8757-8e730fccb6bb/volumes" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.256355 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.318085 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/397c7023-cd6a-42ac-8d37-5813f5f9d45e-logs\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.318336 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-config-data\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.318405 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-config-data-custom\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.318425 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-scripts\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.318469 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.318489 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lgnk\" (UniqueName: \"kubernetes.io/projected/397c7023-cd6a-42ac-8d37-5813f5f9d45e-kube-api-access-5lgnk\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.318518 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.318555 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.318592 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/397c7023-cd6a-42ac-8d37-5813f5f9d45e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.343087 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-7d40-account-create-2b8c5" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.420318 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chhzf\" (UniqueName: \"kubernetes.io/projected/14ced404-77df-4f44-aece-f7e3d8add6d2-kube-api-access-chhzf\") pod \"14ced404-77df-4f44-aece-f7e3d8add6d2\" (UID: \"14ced404-77df-4f44-aece-f7e3d8add6d2\") " Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.420711 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.420746 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lgnk\" (UniqueName: \"kubernetes.io/projected/397c7023-cd6a-42ac-8d37-5813f5f9d45e-kube-api-access-5lgnk\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.420788 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.420810 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.420853 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/397c7023-cd6a-42ac-8d37-5813f5f9d45e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.420925 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/397c7023-cd6a-42ac-8d37-5813f5f9d45e-logs\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.420955 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-config-data\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.421001 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-config-data-custom\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.421023 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-scripts\") pod 
\"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.421119 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/397c7023-cd6a-42ac-8d37-5813f5f9d45e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.424245 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.424745 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/397c7023-cd6a-42ac-8d37-5813f5f9d45e-logs\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.426166 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14ced404-77df-4f44-aece-f7e3d8add6d2-kube-api-access-chhzf" (OuterVolumeSpecName: "kube-api-access-chhzf") pod "14ced404-77df-4f44-aece-f7e3d8add6d2" (UID: "14ced404-77df-4f44-aece-f7e3d8add6d2"). InnerVolumeSpecName "kube-api-access-chhzf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.427212 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.427517 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.427518 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-scripts\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.428534 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-config-data-custom\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.431226 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/397c7023-cd6a-42ac-8d37-5813f5f9d45e-config-data\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.439641 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lgnk\" (UniqueName: 
\"kubernetes.io/projected/397c7023-cd6a-42ac-8d37-5813f5f9d45e-kube-api-access-5lgnk\") pod \"cinder-api-0\" (UID: \"397c7023-cd6a-42ac-8d37-5813f5f9d45e\") " pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.526958 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chhzf\" (UniqueName: \"kubernetes.io/projected/14ced404-77df-4f44-aece-f7e3d8add6d2-kube-api-access-chhzf\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.574917 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.805800 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"bd274ff4-663a-4621-af28-d5fca3e5b139","Type":"ContainerDied","Data":"5a164581da1961c3054d1499980b075e6558bafcef62c4a0f3321527ce14d0a6"} Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.806091 4691 scope.go:117] "RemoveContainer" containerID="b84ccdedf764479d11a712b4b4ddd548c5760af2524c9c926d968905051ec7a8" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.805835 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.808086 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-7d40-account-create-2b8c5" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.808083 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-7d40-account-create-2b8c5" event={"ID":"14ced404-77df-4f44-aece-f7e3d8add6d2","Type":"ContainerDied","Data":"52e7c2f153d408f8c2370e0c31ac7e34640ae3e259ffd72792442a5610baf032"} Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.808127 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="52e7c2f153d408f8c2370e0c31ac7e34640ae3e259ffd72792442a5610baf032" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.833635 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.834406 4691 scope.go:117] "RemoveContainer" containerID="5a2b6edbc643a5929286b269316a17d2033f2b4ddab512e09a4bf6912eb16c15" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.846391 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.856618 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:37:49 crc kubenswrapper[4691]: E0930 06:37:49.857135 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14ced404-77df-4f44-aece-f7e3d8add6d2" containerName="mariadb-account-create" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.857154 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="14ced404-77df-4f44-aece-f7e3d8add6d2" containerName="mariadb-account-create" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.857335 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="14ced404-77df-4f44-aece-f7e3d8add6d2" containerName="mariadb-account-create" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.858399 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.861170 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.861264 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.866080 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.934895 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d2a83f16-21dd-442b-b27d-6c583c783055-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.934940 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.934991 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2a83f16-21dd-442b-b27d-6c583c783055-logs\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.935008 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sz4w\" (UniqueName: \"kubernetes.io/projected/d2a83f16-21dd-442b-b27d-6c583c783055-kube-api-access-5sz4w\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.935044 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-config-data\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.935075 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.935103 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:49 crc kubenswrapper[4691]: I0930 06:37:49.935140 4691 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-scripts\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.037265 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/d2a83f16-21dd-442b-b27d-6c583c783055-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.037313 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.037457 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2a83f16-21dd-442b-b27d-6c583c783055-logs\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.037480 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5sz4w\" (UniqueName: \"kubernetes.io/projected/d2a83f16-21dd-442b-b27d-6c583c783055-kube-api-access-5sz4w\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.037538 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-config-data\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.037602 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.037630 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.037685 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-scripts\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.038152 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/d2a83f16-21dd-442b-b27d-6c583c783055-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.038601 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d2a83f16-21dd-442b-b27d-6c583c783055-logs\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.038989 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.039606 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.044102 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.044175 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-scripts\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.052231 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.052540 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d2a83f16-21dd-442b-b27d-6c583c783055-config-data\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.059629 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sz4w\" (UniqueName: \"kubernetes.io/projected/d2a83f16-21dd-442b-b27d-6c583c783055-kube-api-access-5sz4w\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.098154 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-0\" (UID: \"d2a83f16-21dd-442b-b27d-6c583c783055\") " pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.185677 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.677749 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7vkj"] Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.679354 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.683747 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.683774 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.683987 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6spfr" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.688407 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7vkj"] Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.732283 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Sep 30 06:37:50 crc kubenswrapper[4691]: W0930 06:37:50.736257 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd2a83f16_21dd_442b_b27d_6c583c783055.slice/crio-6b0967c70c9caa7ac79242e291cb822716b49be10b1bc9b84814456e274ba1da WatchSource:0}: Error finding container 6b0967c70c9caa7ac79242e291cb822716b49be10b1bc9b84814456e274ba1da: Status 404 returned error can't find the container with id 6b0967c70c9caa7ac79242e291cb822716b49be10b1bc9b84814456e274ba1da Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.753130 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-scripts\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.753198 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrj78\" (UniqueName: \"kubernetes.io/projected/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-kube-api-access-qrj78\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.753223 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-config-data\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.753306 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 
06:37:50.855233 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrj78\" (UniqueName: \"kubernetes.io/projected/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-kube-api-access-qrj78\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.855293 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-config-data\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.855403 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.855479 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-scripts\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.859309 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-scripts\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.862483 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-config-data\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.865855 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d2a83f16-21dd-442b-b27d-6c583c783055","Type":"ContainerStarted","Data":"6b0967c70c9caa7ac79242e291cb822716b49be10b1bc9b84814456e274ba1da"} Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.873849 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.874607 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrj78\" (UniqueName: \"kubernetes.io/projected/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-kube-api-access-qrj78\") pod \"nova-cell0-conductor-db-sync-k7vkj\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.878648 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.883530 4691 generic.go:334] "Generic (PLEG): container finished" podID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerID="4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b" exitCode=0 Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.883589 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3c5478-606b-4088-aec6-a5652fc3ffb1","Type":"ContainerDied","Data":"4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b"} Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.883613 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc3c5478-606b-4088-aec6-a5652fc3ffb1","Type":"ContainerDied","Data":"5a7a5bcff3758c5f9d9da70a55d698c27487ff6e99c2729cf15ef82a76a16249"} Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.883629 4691 scope.go:117] "RemoveContainer" containerID="e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.886857 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"397c7023-cd6a-42ac-8d37-5813f5f9d45e","Type":"ContainerStarted","Data":"07532e10633130f4803c87b8cc1138f40f0509f74a6d6ee9c146a345aa91e269"} Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.886878 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"397c7023-cd6a-42ac-8d37-5813f5f9d45e","Type":"ContainerStarted","Data":"d21b5b620a57752b5cea4c6b49a0ade23ea0bf40582bdcb2fe1f2f8f23cf5432"} Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.956435 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-sg-core-conf-yaml\") pod \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.956480 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-log-httpd\") pod \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.956504 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mjz4\" (UniqueName: \"kubernetes.io/projected/fc3c5478-606b-4088-aec6-a5652fc3ffb1-kube-api-access-7mjz4\") pod \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.956626 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-combined-ca-bundle\") pod \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.956649 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-scripts\") pod \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.956729 4691 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-config-data\") pod \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.956761 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-run-httpd\") pod \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\" (UID: \"fc3c5478-606b-4088-aec6-a5652fc3ffb1\") " Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.958240 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fc3c5478-606b-4088-aec6-a5652fc3ffb1" (UID: "fc3c5478-606b-4088-aec6-a5652fc3ffb1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.958576 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fc3c5478-606b-4088-aec6-a5652fc3ffb1" (UID: "fc3c5478-606b-4088-aec6-a5652fc3ffb1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.962841 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-scripts" (OuterVolumeSpecName: "scripts") pod "fc3c5478-606b-4088-aec6-a5652fc3ffb1" (UID: "fc3c5478-606b-4088-aec6-a5652fc3ffb1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.982473 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc3c5478-606b-4088-aec6-a5652fc3ffb1-kube-api-access-7mjz4" (OuterVolumeSpecName: "kube-api-access-7mjz4") pod "fc3c5478-606b-4088-aec6-a5652fc3ffb1" (UID: "fc3c5478-606b-4088-aec6-a5652fc3ffb1"). InnerVolumeSpecName "kube-api-access-7mjz4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.985413 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fc3c5478-606b-4088-aec6-a5652fc3ffb1" (UID: "fc3c5478-606b-4088-aec6-a5652fc3ffb1"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:50 crc kubenswrapper[4691]: I0930 06:37:50.993242 4691 scope.go:117] "RemoveContainer" containerID="d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.005623 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.045366 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc3c5478-606b-4088-aec6-a5652fc3ffb1" (UID: "fc3c5478-606b-4088-aec6-a5652fc3ffb1"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.060013 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.060047 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mjz4\" (UniqueName: \"kubernetes.io/projected/fc3c5478-606b-4088-aec6-a5652fc3ffb1-kube-api-access-7mjz4\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.060059 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.060070 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.060079 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc3c5478-606b-4088-aec6-a5652fc3ffb1-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.060087 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.067217 4691 scope.go:117] "RemoveContainer" containerID="e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.067290 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.067328 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.084640 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-config-data" (OuterVolumeSpecName: "config-data") pod "fc3c5478-606b-4088-aec6-a5652fc3ffb1" (UID: "fc3c5478-606b-4088-aec6-a5652fc3ffb1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.115264 4691 scope.go:117] "RemoveContainer" containerID="4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.135921 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.146415 4691 scope.go:117] "RemoveContainer" containerID="e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f" Sep 30 06:37:51 crc kubenswrapper[4691]: E0930 06:37:51.153550 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f\": container with ID starting with e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f not found: ID does not exist" containerID="e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.153592 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f"} err="failed to get container status \"e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f\": rpc error: code = NotFound desc = could not find container \"e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f\": container with ID starting with e56c1aef99344c6617c9438d314d5ed8068b0ff13f7b97939a3f06470b415f8f not found: ID does not exist" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.153618 4691 scope.go:117] "RemoveContainer" containerID="d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800" Sep 30 06:37:51 crc kubenswrapper[4691]: E0930 06:37:51.153973 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800\": container with ID starting with d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800 not found: ID does not exist" containerID="d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.154009 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800"} err="failed to get container status \"d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800\": rpc error: code = NotFound desc = could not find container \"d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800\": container with ID starting with d7219b79838d64327fa98263364adc021362f2ae5163a59e56137838bd4be800 not found: ID does not exist" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.154034 4691 scope.go:117] "RemoveContainer" containerID="e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12" Sep 30 06:37:51 crc kubenswrapper[4691]: E0930 06:37:51.154232 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12\": container with ID starting with e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12 not found: ID does not exist" containerID="e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12" Sep 30 06:37:51 crc 
kubenswrapper[4691]: I0930 06:37:51.154255 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12"} err="failed to get container status \"e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12\": rpc error: code = NotFound desc = could not find container \"e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12\": container with ID starting with e03474c65cd794f937c4e31b37cd3635524bbb84f9d33a96c20c25488dbd6b12 not found: ID does not exist" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.154269 4691 scope.go:117] "RemoveContainer" containerID="4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b" Sep 30 06:37:51 crc kubenswrapper[4691]: E0930 06:37:51.154434 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b\": container with ID starting with 4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b not found: ID does not exist" containerID="4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.154457 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b"} err="failed to get container status \"4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b\": rpc error: code = NotFound desc = could not find container \"4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b\": container with ID starting with 4bff9abf5544d3a91d38a65772c3ac052c7cc96983bb4b7e3b467eb6d4005e7b not found: ID does not exist" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.164442 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc3c5478-606b-4088-aec6-a5652fc3ffb1-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.176775 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.283356 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd274ff4-663a-4621-af28-d5fca3e5b139" path="/var/lib/kubelet/pods/bd274ff4-663a-4621-af28-d5fca3e5b139/volumes" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.552800 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7vkj"] Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.901660 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.916781 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"397c7023-cd6a-42ac-8d37-5813f5f9d45e","Type":"ContainerStarted","Data":"a9c98d579f3946230259ccff549894c027aa62b5bf9ac8ce1a68915868a12df7"} Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.917409 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.936971 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.937010 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d2a83f16-21dd-442b-b27d-6c583c783055","Type":"ContainerStarted","Data":"bccaad23e45872ffc9eb3fb01492aef128a3a057da30fa9305fd4116b36cd834"} Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.956041 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7vkj" event={"ID":"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a","Type":"ContainerStarted","Data":"69863cea142f7fc425749abcf6db07d69e2f6e54809b792eaba4fda589c3c369"} Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.956079 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.956095 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.956201 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.979962 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:51 crc kubenswrapper[4691]: E0930 06:37:51.980418 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="proxy-httpd" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.980431 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="proxy-httpd" Sep 30 06:37:51 crc kubenswrapper[4691]: E0930 06:37:51.980460 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="sg-core" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.980466 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="sg-core" Sep 30 06:37:51 crc kubenswrapper[4691]: E0930 06:37:51.980479 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="ceilometer-central-agent" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.980487 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="ceilometer-central-agent" Sep 30 06:37:51 crc kubenswrapper[4691]: E0930 06:37:51.980495 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="ceilometer-notification-agent" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.980501 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" 
containerName="ceilometer-notification-agent" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.980679 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="ceilometer-central-agent" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.980687 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="ceilometer-notification-agent" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.980701 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="proxy-httpd" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.980720 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" containerName="sg-core" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.983933 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.990336 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.99031367 podStartE2EDuration="2.99031367s" podCreationTimestamp="2025-09-30 06:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:51.948012368 +0000 UTC m=+1115.423033418" watchObservedRunningTime="2025-09-30 06:37:51.99031367 +0000 UTC m=+1115.465334720" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.991298 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 06:37:51 crc kubenswrapper[4691]: I0930 06:37:51.991972 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.017945 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.084418 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-run-httpd\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.084541 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q78nl\" (UniqueName: \"kubernetes.io/projected/f0c2e886-a6f2-4a1f-9690-7120d2f44389-kube-api-access-q78nl\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.084571 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.084627 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-log-httpd\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " 
pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.084696 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-scripts\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.084719 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.084757 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-config-data\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.186737 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-config-data\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.187041 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-run-httpd\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.187101 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q78nl\" (UniqueName: \"kubernetes.io/projected/f0c2e886-a6f2-4a1f-9690-7120d2f44389-kube-api-access-q78nl\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.187121 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.187163 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-log-httpd\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.187219 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-scripts\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.187246 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.187474 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-run-httpd\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.187716 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-log-httpd\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.192614 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-scripts\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.192611 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-config-data\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.193583 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.195201 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.218398 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q78nl\" (UniqueName: \"kubernetes.io/projected/f0c2e886-a6f2-4a1f-9690-7120d2f44389-kube-api-access-q78nl\") pod \"ceilometer-0\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.307629 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.798121 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.971061 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0c2e886-a6f2-4a1f-9690-7120d2f44389","Type":"ContainerStarted","Data":"9f2ea05176d3c14fb7d81fec7c8bec9bb6f857b646711d486c058417c59a6c4e"} Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.976200 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"d2a83f16-21dd-442b-b27d-6c583c783055","Type":"ContainerStarted","Data":"2e29538ea9bc43117f47f9f2868a12cb6753ebf56606ccf3dfdefd9808e84f4b"} Sep 30 06:37:52 crc kubenswrapper[4691]: I0930 06:37:52.996308 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.996292836 podStartE2EDuration="3.996292836s" podCreationTimestamp="2025-09-30 06:37:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:37:52.991406684 +0000 UTC m=+1116.466427724" watchObservedRunningTime="2025-09-30 06:37:52.996292836 +0000 UTC m=+1116.471313876" Sep 30 06:37:53 crc kubenswrapper[4691]: I0930 06:37:53.236795 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc3c5478-606b-4088-aec6-a5652fc3ffb1" path="/var/lib/kubelet/pods/fc3c5478-606b-4088-aec6-a5652fc3ffb1/volumes" Sep 30 06:37:53 crc kubenswrapper[4691]: I0930 06:37:53.712383 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:53 crc kubenswrapper[4691]: I0930 06:37:53.714454 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Sep 30 06:37:53 crc kubenswrapper[4691]: I0930 06:37:53.987013 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0c2e886-a6f2-4a1f-9690-7120d2f44389","Type":"ContainerStarted","Data":"5181953c2169c756ab36c302026e9d44321f562785354b67afd3756e89b3481b"} Sep 30 06:37:53 crc kubenswrapper[4691]: I0930 06:37:53.987258 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0c2e886-a6f2-4a1f-9690-7120d2f44389","Type":"ContainerStarted","Data":"249a4e5f85afaa9d4d7a97ab62f95b81ab9fbdf61fb6c3d67c6b68d38b385d81"} Sep 30 06:37:53 crc kubenswrapper[4691]: I0930 06:37:53.987271 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0c2e886-a6f2-4a1f-9690-7120d2f44389","Type":"ContainerStarted","Data":"cec4918c369d3c4a1dbe1c0335efc2804e351076c2e134fdb347de3b89c650b4"} Sep 30 06:37:58 crc kubenswrapper[4691]: I0930 06:37:58.624109 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.046114 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7vkj" event={"ID":"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a","Type":"ContainerStarted","Data":"a50e72b3684249c6ef44c44bf367e3bc96a00911cf3cfc02d0c6b7bf6ce6bc14"} Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.050286 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"f0c2e886-a6f2-4a1f-9690-7120d2f44389","Type":"ContainerStarted","Data":"82a5cc9cf557bb5b922785b251a895243ef07ee04e63f330c4ddc2de9ece46c8"} Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.050739 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.050515 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="ceilometer-central-agent" containerID="cri-o://cec4918c369d3c4a1dbe1c0335efc2804e351076c2e134fdb347de3b89c650b4" gracePeriod=30 Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.050554 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="proxy-httpd" containerID="cri-o://82a5cc9cf557bb5b922785b251a895243ef07ee04e63f330c4ddc2de9ece46c8" gracePeriod=30 Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.050578 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="ceilometer-notification-agent" containerID="cri-o://249a4e5f85afaa9d4d7a97ab62f95b81ab9fbdf61fb6c3d67c6b68d38b385d81" gracePeriod=30 Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.050639 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="sg-core" containerID="cri-o://5181953c2169c756ab36c302026e9d44321f562785354b67afd3756e89b3481b" gracePeriod=30 Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.066591 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-k7vkj" podStartSLOduration=2.052127815 podStartE2EDuration="10.066566319s" podCreationTimestamp="2025-09-30 06:37:50 +0000 UTC" firstStartedPulling="2025-09-30 06:37:51.563527181 +0000 UTC m=+1115.038548211" lastFinishedPulling="2025-09-30 06:37:59.577965685 +0000 UTC m=+1123.052986715" observedRunningTime="2025-09-30 06:38:00.063358876 +0000 UTC m=+1123.538379956" watchObservedRunningTime="2025-09-30 06:38:00.066566319 +0000 UTC m=+1123.541587369" Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.088654 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.331431184 podStartE2EDuration="9.088628611s" podCreationTimestamp="2025-09-30 06:37:51 +0000 UTC" firstStartedPulling="2025-09-30 06:37:52.80799346 +0000 UTC m=+1116.283014510" lastFinishedPulling="2025-09-30 06:37:59.565190857 +0000 UTC m=+1123.040211937" observedRunningTime="2025-09-30 06:38:00.087873996 +0000 UTC m=+1123.562895046" watchObservedRunningTime="2025-09-30 06:38:00.088628611 +0000 UTC m=+1123.563649681" Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.187123 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.187211 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 06:38:00.235002 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 06:38:00 crc kubenswrapper[4691]: I0930 
06:38:00.263649 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Sep 30 06:38:01 crc kubenswrapper[4691]: I0930 06:38:01.067796 4691 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerID="82a5cc9cf557bb5b922785b251a895243ef07ee04e63f330c4ddc2de9ece46c8" exitCode=0 Sep 30 06:38:01 crc kubenswrapper[4691]: I0930 06:38:01.067874 4691 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerID="5181953c2169c756ab36c302026e9d44321f562785354b67afd3756e89b3481b" exitCode=2 Sep 30 06:38:01 crc kubenswrapper[4691]: I0930 06:38:01.067910 4691 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerID="cec4918c369d3c4a1dbe1c0335efc2804e351076c2e134fdb347de3b89c650b4" exitCode=0 Sep 30 06:38:01 crc kubenswrapper[4691]: I0930 06:38:01.067874 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0c2e886-a6f2-4a1f-9690-7120d2f44389","Type":"ContainerDied","Data":"82a5cc9cf557bb5b922785b251a895243ef07ee04e63f330c4ddc2de9ece46c8"} Sep 30 06:38:01 crc kubenswrapper[4691]: I0930 06:38:01.067967 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0c2e886-a6f2-4a1f-9690-7120d2f44389","Type":"ContainerDied","Data":"5181953c2169c756ab36c302026e9d44321f562785354b67afd3756e89b3481b"} Sep 30 06:38:01 crc kubenswrapper[4691]: I0930 06:38:01.067991 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0c2e886-a6f2-4a1f-9690-7120d2f44389","Type":"ContainerDied","Data":"cec4918c369d3c4a1dbe1c0335efc2804e351076c2e134fdb347de3b89c650b4"} Sep 30 06:38:01 crc kubenswrapper[4691]: I0930 06:38:01.068245 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 06:38:01 crc kubenswrapper[4691]: I0930 06:38:01.068451 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Sep 30 06:38:01 crc kubenswrapper[4691]: I0930 06:38:01.474382 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.084121 4691 generic.go:334] "Generic (PLEG): container finished" podID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerID="249a4e5f85afaa9d4d7a97ab62f95b81ab9fbdf61fb6c3d67c6b68d38b385d81" exitCode=0 Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.084196 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0c2e886-a6f2-4a1f-9690-7120d2f44389","Type":"ContainerDied","Data":"249a4e5f85afaa9d4d7a97ab62f95b81ab9fbdf61fb6c3d67c6b68d38b385d81"} Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.359635 4691 util.go:48] "No ready sandbox for pod can be found. 
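The teardown above pairs two kinds of records: kuberuntime_container.go:808 logs each "Killing container with a grace period" with both containerName and the cri-o:// containerID, while the later generic.go:334 "container finished" lines carry only the ID and exitCode (here proxy-httpd and ceilometer-central-agent exit 0 and sg-core exits 2 within the 30s grace window). Joining the two gives a per-name exit report; a rough sketch under the same line-format assumptions as before:

    package main

    import (
        "bufio"
        "fmt"
        "os"
        "regexp"
    )

    func main() {
        kill := regexp.MustCompile(`containerName="([^"]+)" containerID="cri-o://([0-9a-f]+)"`)
        died := regexp.MustCompile(`containerID="([0-9a-f]+)" exitCode=(-?\d+)`)
        name := map[string]string{} // containerID -> containerName
        sc := bufio.NewScanner(os.Stdin)
        sc.Buffer(make([]byte, 1024*1024), 1024*1024)
        for sc.Scan() {
            line := sc.Text()
            for _, m := range kill.FindAllStringSubmatch(line, -1) {
                name[m[2]] = m[1] // remember the name announced at kill time
            }
            for _, m := range died.FindAllStringSubmatch(line, -1) {
                if n, ok := name[m[1]]; ok {
                    fmt.Printf("%-35s exited %s\n", n, m[2])
                }
            }
        }
    }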
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.483149 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-run-httpd\") pod \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.483195 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-config-data\") pod \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.483238 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-sg-core-conf-yaml\") pod \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.483290 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-scripts\") pod \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.483306 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q78nl\" (UniqueName: \"kubernetes.io/projected/f0c2e886-a6f2-4a1f-9690-7120d2f44389-kube-api-access-q78nl\") pod \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.483358 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-combined-ca-bundle\") pod \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.484018 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-log-httpd\") pod \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\" (UID: \"f0c2e886-a6f2-4a1f-9690-7120d2f44389\") " Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.484287 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f0c2e886-a6f2-4a1f-9690-7120d2f44389" (UID: "f0c2e886-a6f2-4a1f-9690-7120d2f44389"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.484541 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f0c2e886-a6f2-4a1f-9690-7120d2f44389" (UID: "f0c2e886-a6f2-4a1f-9690-7120d2f44389"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.484556 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.488777 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-scripts" (OuterVolumeSpecName: "scripts") pod "f0c2e886-a6f2-4a1f-9690-7120d2f44389" (UID: "f0c2e886-a6f2-4a1f-9690-7120d2f44389"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.500357 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0c2e886-a6f2-4a1f-9690-7120d2f44389-kube-api-access-q78nl" (OuterVolumeSpecName: "kube-api-access-q78nl") pod "f0c2e886-a6f2-4a1f-9690-7120d2f44389" (UID: "f0c2e886-a6f2-4a1f-9690-7120d2f44389"). InnerVolumeSpecName "kube-api-access-q78nl". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.534334 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f0c2e886-a6f2-4a1f-9690-7120d2f44389" (UID: "f0c2e886-a6f2-4a1f-9690-7120d2f44389"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.586093 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.586125 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.586135 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q78nl\" (UniqueName: \"kubernetes.io/projected/f0c2e886-a6f2-4a1f-9690-7120d2f44389-kube-api-access-q78nl\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.586144 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f0c2e886-a6f2-4a1f-9690-7120d2f44389-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.610206 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-config-data" (OuterVolumeSpecName: "config-data") pod "f0c2e886-a6f2-4a1f-9690-7120d2f44389" (UID: "f0c2e886-a6f2-4a1f-9690-7120d2f44389"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.622126 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0c2e886-a6f2-4a1f-9690-7120d2f44389" (UID: "f0c2e886-a6f2-4a1f-9690-7120d2f44389"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.687452 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.687484 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0c2e886-a6f2-4a1f-9690-7120d2f44389-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.991679 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 06:38:02 crc kubenswrapper[4691]: I0930 06:38:02.992989 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.124544 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.139356 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f0c2e886-a6f2-4a1f-9690-7120d2f44389","Type":"ContainerDied","Data":"9f2ea05176d3c14fb7d81fec7c8bec9bb6f857b646711d486c058417c59a6c4e"} Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.139409 4691 scope.go:117] "RemoveContainer" containerID="82a5cc9cf557bb5b922785b251a895243ef07ee04e63f330c4ddc2de9ece46c8" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.176911 4691 scope.go:117] "RemoveContainer" containerID="5181953c2169c756ab36c302026e9d44321f562785354b67afd3756e89b3481b" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.219611 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.227335 4691 scope.go:117] "RemoveContainer" containerID="249a4e5f85afaa9d4d7a97ab62f95b81ab9fbdf61fb6c3d67c6b68d38b385d81" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.257110 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.257147 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:03 crc kubenswrapper[4691]: E0930 06:38:03.257393 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="ceilometer-notification-agent" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.257408 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="ceilometer-notification-agent" Sep 30 06:38:03 crc kubenswrapper[4691]: E0930 06:38:03.257422 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="proxy-httpd" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.257429 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="proxy-httpd" Sep 30 06:38:03 crc kubenswrapper[4691]: E0930 06:38:03.257449 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="sg-core" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.257454 4691 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="sg-core" Sep 30 06:38:03 crc kubenswrapper[4691]: E0930 06:38:03.257485 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="ceilometer-central-agent" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.257490 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="ceilometer-central-agent" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.257650 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="proxy-httpd" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.257668 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="ceilometer-notification-agent" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.257683 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="sg-core" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.257691 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" containerName="ceilometer-central-agent" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.259450 4691 scope.go:117] "RemoveContainer" containerID="cec4918c369d3c4a1dbe1c0335efc2804e351076c2e134fdb347de3b89c650b4" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.259632 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.259713 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.263913 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.264171 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.326722 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk7q4\" (UniqueName: \"kubernetes.io/projected/db1e0286-c4b2-40de-aa35-50da1ced7b3d-kube-api-access-xk7q4\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.326962 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-scripts\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.327105 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.327131 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-run-httpd\") pod \"ceilometer-0\" (UID: 
\"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.327205 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-log-httpd\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.327283 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-config-data\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.327340 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.429704 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-scripts\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.429759 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.429779 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-run-httpd\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.429811 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-log-httpd\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.429844 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-config-data\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.429876 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.429934 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk7q4\" (UniqueName: 
\"kubernetes.io/projected/db1e0286-c4b2-40de-aa35-50da1ced7b3d-kube-api-access-xk7q4\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.430264 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-run-httpd\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.430306 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-log-httpd\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.437250 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.437507 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-scripts\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.437654 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-config-data\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.438572 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.447312 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk7q4\" (UniqueName: \"kubernetes.io/projected/db1e0286-c4b2-40de-aa35-50da1ced7b3d-kube-api-access-xk7q4\") pod \"ceilometer-0\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " pod="openstack/ceilometer-0" Sep 30 06:38:03 crc kubenswrapper[4691]: I0930 06:38:03.578110 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:38:04 crc kubenswrapper[4691]: I0930 06:38:04.016517 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:04 crc kubenswrapper[4691]: I0930 06:38:04.158742 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1e0286-c4b2-40de-aa35-50da1ced7b3d","Type":"ContainerStarted","Data":"0428b8d2be746d95cf6b44538569e93729b4c122d91cc0b2c758a783f4bed8fe"} Sep 30 06:38:05 crc kubenswrapper[4691]: I0930 06:38:05.175693 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1e0286-c4b2-40de-aa35-50da1ced7b3d","Type":"ContainerStarted","Data":"dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82"} Sep 30 06:38:05 crc kubenswrapper[4691]: I0930 06:38:05.258534 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0c2e886-a6f2-4a1f-9690-7120d2f44389" path="/var/lib/kubelet/pods/f0c2e886-a6f2-4a1f-9690-7120d2f44389/volumes" Sep 30 06:38:05 crc kubenswrapper[4691]: I0930 06:38:05.270016 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:07 crc kubenswrapper[4691]: I0930 06:38:07.209181 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1e0286-c4b2-40de-aa35-50da1ced7b3d","Type":"ContainerStarted","Data":"3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f"} Sep 30 06:38:08 crc kubenswrapper[4691]: I0930 06:38:08.221696 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1e0286-c4b2-40de-aa35-50da1ced7b3d","Type":"ContainerStarted","Data":"9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67"} Sep 30 06:38:09 crc kubenswrapper[4691]: I0930 06:38:09.234351 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="ceilometer-central-agent" containerID="cri-o://dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82" gracePeriod=30 Sep 30 06:38:09 crc kubenswrapper[4691]: I0930 06:38:09.235090 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="proxy-httpd" containerID="cri-o://bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a" gracePeriod=30 Sep 30 06:38:09 crc kubenswrapper[4691]: I0930 06:38:09.235135 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="sg-core" containerID="cri-o://9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67" gracePeriod=30 Sep 30 06:38:09 crc kubenswrapper[4691]: I0930 06:38:09.235188 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="ceilometer-notification-agent" containerID="cri-o://3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f" gracePeriod=30 Sep 30 06:38:09 crc kubenswrapper[4691]: I0930 06:38:09.254359 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1e0286-c4b2-40de-aa35-50da1ced7b3d","Type":"ContainerStarted","Data":"bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a"} Sep 30 06:38:09 crc kubenswrapper[4691]: I0930 
06:38:09.254405 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 06:38:10 crc kubenswrapper[4691]: I0930 06:38:10.252606 4691 generic.go:334] "Generic (PLEG): container finished" podID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerID="9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67" exitCode=2 Sep 30 06:38:10 crc kubenswrapper[4691]: I0930 06:38:10.253006 4691 generic.go:334] "Generic (PLEG): container finished" podID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerID="3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f" exitCode=0 Sep 30 06:38:10 crc kubenswrapper[4691]: I0930 06:38:10.252710 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1e0286-c4b2-40de-aa35-50da1ced7b3d","Type":"ContainerDied","Data":"9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67"} Sep 30 06:38:10 crc kubenswrapper[4691]: I0930 06:38:10.253067 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1e0286-c4b2-40de-aa35-50da1ced7b3d","Type":"ContainerDied","Data":"3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f"} Sep 30 06:38:13 crc kubenswrapper[4691]: I0930 06:38:13.294588 4691 generic.go:334] "Generic (PLEG): container finished" podID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerID="dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82" exitCode=0 Sep 30 06:38:13 crc kubenswrapper[4691]: I0930 06:38:13.295015 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1e0286-c4b2-40de-aa35-50da1ced7b3d","Type":"ContainerDied","Data":"dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82"} Sep 30 06:38:13 crc kubenswrapper[4691]: I0930 06:38:13.298265 4691 generic.go:334] "Generic (PLEG): container finished" podID="c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a" containerID="a50e72b3684249c6ef44c44bf367e3bc96a00911cf3cfc02d0c6b7bf6ce6bc14" exitCode=0 Sep 30 06:38:13 crc kubenswrapper[4691]: I0930 06:38:13.298318 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7vkj" event={"ID":"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a","Type":"ContainerDied","Data":"a50e72b3684249c6ef44c44bf367e3bc96a00911cf3cfc02d0c6b7bf6ce6bc14"} Sep 30 06:38:13 crc kubenswrapper[4691]: I0930 06:38:13.320992 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=5.5110078940000005 podStartE2EDuration="10.320968118s" podCreationTimestamp="2025-09-30 06:38:03 +0000 UTC" firstStartedPulling="2025-09-30 06:38:04.019226644 +0000 UTC m=+1127.494247684" lastFinishedPulling="2025-09-30 06:38:08.829186868 +0000 UTC m=+1132.304207908" observedRunningTime="2025-09-30 06:38:09.276125563 +0000 UTC m=+1132.751146623" watchObservedRunningTime="2025-09-30 06:38:13.320968118 +0000 UTC m=+1136.795989188" Sep 30 06:38:14 crc kubenswrapper[4691]: I0930 06:38:14.808237 4691 util.go:48] "No ready sandbox for pod can be found. 
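The pod_startup_latency_tracker record above for ceilometer-0 makes the bookkeeping visible: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (06:38:13.320968118 - 06:38:03 = 10.320968118s), and podStartSLOduration appears to be that figure minus the image-pull window, lastFinishedPulling - firstStartedPulling = 4.809960224s, which yields exactly the 5.511007894s printed (trailing float noise aside). The earlier glance-default-external-api-0 record, whose pull timestamps are the zero time, fits the same reading: there the SLO and E2E durations coincide. A quick arithmetic check using the monotonic m=+ offsets from the log:

    package main

    import "fmt"

    // Re-derive the ceilometer-0 startup-latency record from the
    // kubelet monotonic offsets (m=+...) printed above.
    func main() {
        const (
            firstStartedPulling = 1127.494247684 // 06:38:04.019226644
            lastFinishedPulling = 1132.304207908 // 06:38:08.829186868
            podStartE2E         = 10.320968118   // watchObservedRunningTime - podCreationTimestamp
        )
        pull := lastFinishedPulling - firstStartedPulling
        fmt.Printf("image-pull window:   %.9fs\n", pull)             // 4.809960224s
        fmt.Printf("podStartSLOduration: %.9fs\n", podStartE2E-pull) // 5.511007894s
    }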
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:38:14 crc kubenswrapper[4691]: I0930 06:38:14.966813 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-combined-ca-bundle\") pod \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " Sep 30 06:38:14 crc kubenswrapper[4691]: I0930 06:38:14.966911 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-config-data\") pod \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " Sep 30 06:38:14 crc kubenswrapper[4691]: I0930 06:38:14.967024 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-scripts\") pod \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " Sep 30 06:38:14 crc kubenswrapper[4691]: I0930 06:38:14.967055 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrj78\" (UniqueName: \"kubernetes.io/projected/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-kube-api-access-qrj78\") pod \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\" (UID: \"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a\") " Sep 30 06:38:14 crc kubenswrapper[4691]: I0930 06:38:14.973774 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-scripts" (OuterVolumeSpecName: "scripts") pod "c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a" (UID: "c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:14 crc kubenswrapper[4691]: I0930 06:38:14.974671 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-kube-api-access-qrj78" (OuterVolumeSpecName: "kube-api-access-qrj78") pod "c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a" (UID: "c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a"). InnerVolumeSpecName "kube-api-access-qrj78". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:38:14 crc kubenswrapper[4691]: I0930 06:38:14.998237 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a" (UID: "c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.020621 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-config-data" (OuterVolumeSpecName: "config-data") pod "c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a" (UID: "c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.069025 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.069058 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrj78\" (UniqueName: \"kubernetes.io/projected/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-kube-api-access-qrj78\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.069071 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.069082 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.328338 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-k7vkj" event={"ID":"c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a","Type":"ContainerDied","Data":"69863cea142f7fc425749abcf6db07d69e2f6e54809b792eaba4fda589c3c369"} Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.328399 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69863cea142f7fc425749abcf6db07d69e2f6e54809b792eaba4fda589c3c369" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.328414 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-k7vkj" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.466661 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 06:38:15 crc kubenswrapper[4691]: E0930 06:38:15.467388 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a" containerName="nova-cell0-conductor-db-sync" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.467409 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a" containerName="nova-cell0-conductor-db-sync" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.467635 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a" containerName="nova-cell0-conductor-db-sync" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.468473 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.471145 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-6spfr" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.472159 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.484587 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.579268 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c523d401-a3b1-4181-8216-bbf80156c7c4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c523d401-a3b1-4181-8216-bbf80156c7c4\") " pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.579372 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c523d401-a3b1-4181-8216-bbf80156c7c4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c523d401-a3b1-4181-8216-bbf80156c7c4\") " pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.579544 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqvhw\" (UniqueName: \"kubernetes.io/projected/c523d401-a3b1-4181-8216-bbf80156c7c4-kube-api-access-cqvhw\") pod \"nova-cell0-conductor-0\" (UID: \"c523d401-a3b1-4181-8216-bbf80156c7c4\") " pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.681752 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c523d401-a3b1-4181-8216-bbf80156c7c4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c523d401-a3b1-4181-8216-bbf80156c7c4\") " pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.681834 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c523d401-a3b1-4181-8216-bbf80156c7c4-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c523d401-a3b1-4181-8216-bbf80156c7c4\") " pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.682000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqvhw\" (UniqueName: \"kubernetes.io/projected/c523d401-a3b1-4181-8216-bbf80156c7c4-kube-api-access-cqvhw\") pod \"nova-cell0-conductor-0\" (UID: \"c523d401-a3b1-4181-8216-bbf80156c7c4\") " pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.688087 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c523d401-a3b1-4181-8216-bbf80156c7c4-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c523d401-a3b1-4181-8216-bbf80156c7c4\") " pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.698874 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c523d401-a3b1-4181-8216-bbf80156c7c4-config-data\") pod \"nova-cell0-conductor-0\" 
(UID: \"c523d401-a3b1-4181-8216-bbf80156c7c4\") " pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.710458 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqvhw\" (UniqueName: \"kubernetes.io/projected/c523d401-a3b1-4181-8216-bbf80156c7c4-kube-api-access-cqvhw\") pod \"nova-cell0-conductor-0\" (UID: \"c523d401-a3b1-4181-8216-bbf80156c7c4\") " pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:15 crc kubenswrapper[4691]: I0930 06:38:15.808636 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:16 crc kubenswrapper[4691]: I0930 06:38:16.341579 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Sep 30 06:38:16 crc kubenswrapper[4691]: W0930 06:38:16.348039 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc523d401_a3b1_4181_8216_bbf80156c7c4.slice/crio-7fb010f03418c05f27313bd3638f273486678525aab5de5cc63b09537336aa37 WatchSource:0}: Error finding container 7fb010f03418c05f27313bd3638f273486678525aab5de5cc63b09537336aa37: Status 404 returned error can't find the container with id 7fb010f03418c05f27313bd3638f273486678525aab5de5cc63b09537336aa37 Sep 30 06:38:17 crc kubenswrapper[4691]: I0930 06:38:17.358733 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c523d401-a3b1-4181-8216-bbf80156c7c4","Type":"ContainerStarted","Data":"532f136a4c12da0f2983243759fd30cd233a7138c40feeec21634639939decd4"} Sep 30 06:38:17 crc kubenswrapper[4691]: I0930 06:38:17.362131 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c523d401-a3b1-4181-8216-bbf80156c7c4","Type":"ContainerStarted","Data":"7fb010f03418c05f27313bd3638f273486678525aab5de5cc63b09537336aa37"} Sep 30 06:38:17 crc kubenswrapper[4691]: I0930 06:38:17.362299 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:17 crc kubenswrapper[4691]: I0930 06:38:17.392833 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.392813948 podStartE2EDuration="2.392813948s" podCreationTimestamp="2025-09-30 06:38:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:38:17.382094313 +0000 UTC m=+1140.857115423" watchObservedRunningTime="2025-09-30 06:38:17.392813948 +0000 UTC m=+1140.867834998" Sep 30 06:38:25 crc kubenswrapper[4691]: I0930 06:38:25.971729 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.385194 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-fvqrc"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.387040 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.391320 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.391460 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.409687 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fvqrc"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.437102 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-config-data\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.437160 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.437241 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-scripts\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.437279 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g59hf\" (UniqueName: \"kubernetes.io/projected/5c832fe6-a00d-4666-9f42-1adaae1d9007-kube-api-access-g59hf\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.538027 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-config-data\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.538074 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.538142 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-scripts\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.538173 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g59hf\" (UniqueName: 
\"kubernetes.io/projected/5c832fe6-a00d-4666-9f42-1adaae1d9007-kube-api-access-g59hf\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.548059 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-config-data\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.557066 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g59hf\" (UniqueName: \"kubernetes.io/projected/5c832fe6-a00d-4666-9f42-1adaae1d9007-kube-api-access-g59hf\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.557106 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-scripts\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.557310 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-fvqrc\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.595797 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.605793 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.605930 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.612988 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.640583 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chrjx\" (UniqueName: \"kubernetes.io/projected/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-kube-api-access-chrjx\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.640638 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.640667 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-logs\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.640790 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-config-data\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.668405 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.670454 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.674026 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.687866 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.714628 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.716060 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.720290 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.721923 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.752666 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.752722 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-logs\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.752763 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.752793 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc97q\" (UniqueName: \"kubernetes.io/projected/92ee0693-710e-43cd-a7f6-823fcd0acf42-kube-api-access-mc97q\") pod \"nova-cell1-novncproxy-0\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.752875 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.752916 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-config-data\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.752961 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3555510-11fa-4c8a-bf0b-9196b7a61f36-logs\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.752988 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.753011 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzsbv\" (UniqueName: \"kubernetes.io/projected/e3555510-11fa-4c8a-bf0b-9196b7a61f36-kube-api-access-rzsbv\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.753045 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-chrjx\" (UniqueName: \"kubernetes.io/projected/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-kube-api-access-chrjx\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.753067 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-config-data\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.755584 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-logs\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.767806 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.767875 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.770073 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-config-data\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.790863 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chrjx\" (UniqueName: \"kubernetes.io/projected/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-kube-api-access-chrjx\") pod \"nova-api-0\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.796257 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.797666 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.805647 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.818128 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.841371 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57676fb98c-lb5zs"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.843079 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.856287 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.858945 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3555510-11fa-4c8a-bf0b-9196b7a61f36-logs\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.858997 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.859021 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzsbv\" (UniqueName: \"kubernetes.io/projected/e3555510-11fa-4c8a-bf0b-9196b7a61f36-kube-api-access-rzsbv\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.859059 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-config-data\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.859111 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.859141 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc97q\" (UniqueName: \"kubernetes.io/projected/92ee0693-710e-43cd-a7f6-823fcd0acf42-kube-api-access-mc97q\") pod \"nova-cell1-novncproxy-0\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.859720 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3555510-11fa-4c8a-bf0b-9196b7a61f36-logs\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.859817 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57676fb98c-lb5zs"] Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.865494 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:26 crc 
kubenswrapper[4691]: I0930 06:38:26.866131 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-config-data\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.876992 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.878713 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.880403 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc97q\" (UniqueName: \"kubernetes.io/projected/92ee0693-710e-43cd-a7f6-823fcd0acf42-kube-api-access-mc97q\") pod \"nova-cell1-novncproxy-0\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") " pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.887342 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzsbv\" (UniqueName: \"kubernetes.io/projected/e3555510-11fa-4c8a-bf0b-9196b7a61f36-kube-api-access-rzsbv\") pod \"nova-metadata-0\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") " pod="openstack/nova-metadata-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.953773 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.962426 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlndh\" (UniqueName: \"kubernetes.io/projected/1944fc3a-694c-4642-a242-ace9c04f708e-kube-api-access-jlndh\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.962526 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-swift-storage-0\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.962565 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-nb\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.962717 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nc9zd\" (UniqueName: \"kubernetes.io/projected/fc428a66-0149-4f08-89d7-e9b5749e4bd5-kube-api-access-nc9zd\") pod \"nova-scheduler-0\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.962755 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.962864 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-config-data\") pod \"nova-scheduler-0\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.962983 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-sb\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.963088 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-config\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.963128 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-svc\") pod 
\"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:26 crc kubenswrapper[4691]: I0930 06:38:26.995281 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.047228 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.065129 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlndh\" (UniqueName: \"kubernetes.io/projected/1944fc3a-694c-4642-a242-ace9c04f708e-kube-api-access-jlndh\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.065202 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-swift-storage-0\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.065240 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-nb\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.065289 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nc9zd\" (UniqueName: \"kubernetes.io/projected/fc428a66-0149-4f08-89d7-e9b5749e4bd5-kube-api-access-nc9zd\") pod \"nova-scheduler-0\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.065304 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.065345 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-config-data\") pod \"nova-scheduler-0\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.065376 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-sb\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.065431 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-config\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 
crc kubenswrapper[4691]: I0930 06:38:27.065457 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-svc\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.066388 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-svc\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.067180 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-swift-storage-0\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.067737 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-nb\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.072830 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-config\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.072861 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-sb\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.080131 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.082101 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-config-data\") pod \"nova-scheduler-0\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.087405 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nc9zd\" (UniqueName: \"kubernetes.io/projected/fc428a66-0149-4f08-89d7-e9b5749e4bd5-kube-api-access-nc9zd\") pod \"nova-scheduler-0\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.093166 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlndh\" (UniqueName: 
\"kubernetes.io/projected/1944fc3a-694c-4642-a242-ace9c04f708e-kube-api-access-jlndh\") pod \"dnsmasq-dns-57676fb98c-lb5zs\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.254523 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.267914 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.362466 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-fvqrc"] Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.528931 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fvqrc" event={"ID":"5c832fe6-a00d-4666-9f42-1adaae1d9007","Type":"ContainerStarted","Data":"98bc0e69bb7b50e3eeedf0d7a4c7d88d6ebdedc29c538baa2c264665a9e00cd2"} Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.545834 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:38:27 crc kubenswrapper[4691]: W0930 06:38:27.557671 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5292180f_82cd_4fe7_8bb6_5017fdc07cc4.slice/crio-76d1f2b939ff24113c9be76c16f1c1e5a9b8a4d9ad81641b8627e47fa994164d WatchSource:0}: Error finding container 76d1f2b939ff24113c9be76c16f1c1e5a9b8a4d9ad81641b8627e47fa994164d: Status 404 returned error can't find the container with id 76d1f2b939ff24113c9be76c16f1c1e5a9b8a4d9ad81641b8627e47fa994164d Sep 30 06:38:27 crc kubenswrapper[4691]: W0930 06:38:27.642465 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode3555510_11fa_4c8a_bf0b_9196b7a61f36.slice/crio-254e8bc3af4ee6fc1b8f9812d4ddd4418660149226b672b88afff7e8fb971012 WatchSource:0}: Error finding container 254e8bc3af4ee6fc1b8f9812d4ddd4418660149226b672b88afff7e8fb971012: Status 404 returned error can't find the container with id 254e8bc3af4ee6fc1b8f9812d4ddd4418660149226b672b88afff7e8fb971012 Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.645633 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.667349 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dtmdb"] Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.668716 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.671282 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.671492 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.687724 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dtmdb"] Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.726722 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.776421 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-config-data\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.776540 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-scripts\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.776584 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.776643 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjhd4\" (UniqueName: \"kubernetes.io/projected/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-kube-api-access-cjhd4\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.879418 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-config-data\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.879473 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-scripts\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.879496 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: 
\"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.879524 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjhd4\" (UniqueName: \"kubernetes.io/projected/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-kube-api-access-cjhd4\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.885292 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-scripts\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.891602 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.893272 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57676fb98c-lb5zs"] Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.900064 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-config-data\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.904121 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:38:27 crc kubenswrapper[4691]: I0930 06:38:27.909654 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjhd4\" (UniqueName: \"kubernetes.io/projected/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-kube-api-access-cjhd4\") pod \"nova-cell1-conductor-db-sync-dtmdb\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.019335 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.541713 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc428a66-0149-4f08-89d7-e9b5749e4bd5","Type":"ContainerStarted","Data":"d7ce394cd7c959c17a8afe33e6feff262c007833f9dd31ca03275813d8fcdf72"} Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.547017 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5292180f-82cd-4fe7-8bb6-5017fdc07cc4","Type":"ContainerStarted","Data":"76d1f2b939ff24113c9be76c16f1c1e5a9b8a4d9ad81641b8627e47fa994164d"} Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.547050 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fvqrc" event={"ID":"5c832fe6-a00d-4666-9f42-1adaae1d9007","Type":"ContainerStarted","Data":"c4bae847fff95c49052aac08cac27819361a386f663fe0425075f23a19795649"} Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.549398 4691 generic.go:334] "Generic (PLEG): container finished" podID="1944fc3a-694c-4642-a242-ace9c04f708e" containerID="c74887900b88c9e327db441aae0694a79b15b2f90098071683a764dbb7efb0a7" exitCode=0 Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.549542 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" event={"ID":"1944fc3a-694c-4642-a242-ace9c04f708e","Type":"ContainerDied","Data":"c74887900b88c9e327db441aae0694a79b15b2f90098071683a764dbb7efb0a7"} Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.549605 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" event={"ID":"1944fc3a-694c-4642-a242-ace9c04f708e","Type":"ContainerStarted","Data":"1e8067f5448b49ac398222e9f86e42b61bced0f8e7a31b7ad8f2ac4556de0d6d"} Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.551430 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3555510-11fa-4c8a-bf0b-9196b7a61f36","Type":"ContainerStarted","Data":"254e8bc3af4ee6fc1b8f9812d4ddd4418660149226b672b88afff7e8fb971012"} Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.552290 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"92ee0693-710e-43cd-a7f6-823fcd0acf42","Type":"ContainerStarted","Data":"3f0078cada3bedce31a70406666b76110370196bfc5fde48570a713fcf9120ed"} Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.571008 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-fvqrc" podStartSLOduration=2.570986006 podStartE2EDuration="2.570986006s" podCreationTimestamp="2025-09-30 06:38:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:38:28.561642638 +0000 UTC m=+1152.036663678" watchObservedRunningTime="2025-09-30 06:38:28.570986006 +0000 UTC m=+1152.046007046" Sep 30 06:38:28 crc kubenswrapper[4691]: I0930 06:38:28.660115 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dtmdb"] Sep 30 06:38:28 crc kubenswrapper[4691]: W0930 06:38:28.694412 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c0bdcf6_bc45_43e0_9d6c_e3a4be10a619.slice/crio-3d935ef3f68065a90a1e78389e18ae5baa665eb8036888b7cde6a8111f8c1018 WatchSource:0}: 
Error finding container 3d935ef3f68065a90a1e78389e18ae5baa665eb8036888b7cde6a8111f8c1018: Status 404 returned error can't find the container with id 3d935ef3f68065a90a1e78389e18ae5baa665eb8036888b7cde6a8111f8c1018 Sep 30 06:38:29 crc kubenswrapper[4691]: I0930 06:38:29.566431 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" event={"ID":"1944fc3a-694c-4642-a242-ace9c04f708e","Type":"ContainerStarted","Data":"0bd4b43e5a043453bd748d5aa177344a3b55fc5430f9653230d74ed1396dc421"} Sep 30 06:38:29 crc kubenswrapper[4691]: I0930 06:38:29.566661 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:29 crc kubenswrapper[4691]: I0930 06:38:29.570088 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dtmdb" event={"ID":"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619","Type":"ContainerStarted","Data":"c2574f2c349e8d4c9d576972fe500c1613b4f900eb12a5a5c9b576d65fbfed51"} Sep 30 06:38:29 crc kubenswrapper[4691]: I0930 06:38:29.570112 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dtmdb" event={"ID":"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619","Type":"ContainerStarted","Data":"3d935ef3f68065a90a1e78389e18ae5baa665eb8036888b7cde6a8111f8c1018"} Sep 30 06:38:29 crc kubenswrapper[4691]: I0930 06:38:29.594232 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" podStartSLOduration=3.594210295 podStartE2EDuration="3.594210295s" podCreationTimestamp="2025-09-30 06:38:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:38:29.583146707 +0000 UTC m=+1153.058167767" watchObservedRunningTime="2025-09-30 06:38:29.594210295 +0000 UTC m=+1153.069231325" Sep 30 06:38:29 crc kubenswrapper[4691]: I0930 06:38:29.620559 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-dtmdb" podStartSLOduration=2.620536587 podStartE2EDuration="2.620536587s" podCreationTimestamp="2025-09-30 06:38:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:38:29.613094376 +0000 UTC m=+1153.088115426" watchObservedRunningTime="2025-09-30 06:38:29.620536587 +0000 UTC m=+1153.095557627" Sep 30 06:38:30 crc kubenswrapper[4691]: I0930 06:38:30.394335 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 06:38:30 crc kubenswrapper[4691]: I0930 06:38:30.402187 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.587030 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3555510-11fa-4c8a-bf0b-9196b7a61f36","Type":"ContainerStarted","Data":"1c582eb449995b0023fe2b4ebc8e15a19718f42294e28fe52551eeaf0d5bcbb2"} Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.587337 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3555510-11fa-4c8a-bf0b-9196b7a61f36","Type":"ContainerStarted","Data":"2d466f630232df7ed4e680aee3df7a0daf4cfc0f6007f4c5daf91a421d70e73a"} Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.587127 4691 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/nova-metadata-0" podUID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" containerName="nova-metadata-log" containerID="cri-o://2d466f630232df7ed4e680aee3df7a0daf4cfc0f6007f4c5daf91a421d70e73a" gracePeriod=30 Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.587444 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" containerName="nova-metadata-metadata" containerID="cri-o://1c582eb449995b0023fe2b4ebc8e15a19718f42294e28fe52551eeaf0d5bcbb2" gracePeriod=30 Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.588972 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"92ee0693-710e-43cd-a7f6-823fcd0acf42","Type":"ContainerStarted","Data":"37dd74b2067d2407970a9ec08d0544da7cedbe0561f322a4e76318d781fc5140"} Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.589069 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="92ee0693-710e-43cd-a7f6-823fcd0acf42" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://37dd74b2067d2407970a9ec08d0544da7cedbe0561f322a4e76318d781fc5140" gracePeriod=30 Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.590405 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc428a66-0149-4f08-89d7-e9b5749e4bd5","Type":"ContainerStarted","Data":"09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05"} Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.595193 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5292180f-82cd-4fe7-8bb6-5017fdc07cc4","Type":"ContainerStarted","Data":"02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527"} Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.595292 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5292180f-82cd-4fe7-8bb6-5017fdc07cc4","Type":"ContainerStarted","Data":"fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366"} Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.621675 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.24820586 podStartE2EDuration="5.621655108s" podCreationTimestamp="2025-09-30 06:38:26 +0000 UTC" firstStartedPulling="2025-09-30 06:38:27.645482829 +0000 UTC m=+1151.120503869" lastFinishedPulling="2025-09-30 06:38:31.018932077 +0000 UTC m=+1154.493953117" observedRunningTime="2025-09-30 06:38:31.611200342 +0000 UTC m=+1155.086221382" watchObservedRunningTime="2025-09-30 06:38:31.621655108 +0000 UTC m=+1155.096676148" Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.655080 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.384164471 podStartE2EDuration="5.655030187s" podCreationTimestamp="2025-09-30 06:38:26 +0000 UTC" firstStartedPulling="2025-09-30 06:38:27.751275374 +0000 UTC m=+1151.226296414" lastFinishedPulling="2025-09-30 06:38:31.02214109 +0000 UTC m=+1154.497162130" observedRunningTime="2025-09-30 06:38:31.630355513 +0000 UTC m=+1155.105376573" watchObservedRunningTime="2025-09-30 06:38:31.655030187 +0000 UTC m=+1155.130051227" Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.668110 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" 
podStartSLOduration=2.5735966059999997 podStartE2EDuration="5.667825235s" podCreationTimestamp="2025-09-30 06:38:26 +0000 UTC" firstStartedPulling="2025-09-30 06:38:27.926116888 +0000 UTC m=+1151.401137928" lastFinishedPulling="2025-09-30 06:38:31.020345517 +0000 UTC m=+1154.495366557" observedRunningTime="2025-09-30 06:38:31.647237625 +0000 UTC m=+1155.122258665" watchObservedRunningTime="2025-09-30 06:38:31.667825235 +0000 UTC m=+1155.142846275" Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.679787 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.244536202 podStartE2EDuration="5.679768414s" podCreationTimestamp="2025-09-30 06:38:26 +0000 UTC" firstStartedPulling="2025-09-30 06:38:27.583928643 +0000 UTC m=+1151.058949683" lastFinishedPulling="2025-09-30 06:38:31.019160855 +0000 UTC m=+1154.494181895" observedRunningTime="2025-09-30 06:38:31.663518385 +0000 UTC m=+1155.138539435" watchObservedRunningTime="2025-09-30 06:38:31.679768414 +0000 UTC m=+1155.154789454" Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.996690 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 06:38:31 crc kubenswrapper[4691]: I0930 06:38:31.996962 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 06:38:32 crc kubenswrapper[4691]: I0930 06:38:32.047708 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:38:32 crc kubenswrapper[4691]: I0930 06:38:32.254950 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 06:38:32 crc kubenswrapper[4691]: I0930 06:38:32.604652 4691 generic.go:334] "Generic (PLEG): container finished" podID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" containerID="2d466f630232df7ed4e680aee3df7a0daf4cfc0f6007f4c5daf91a421d70e73a" exitCode=143 Sep 30 06:38:32 crc kubenswrapper[4691]: I0930 06:38:32.604755 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3555510-11fa-4c8a-bf0b-9196b7a61f36","Type":"ContainerDied","Data":"2d466f630232df7ed4e680aee3df7a0daf4cfc0f6007f4c5daf91a421d70e73a"} Sep 30 06:38:33 crc kubenswrapper[4691]: I0930 06:38:33.585054 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Sep 30 06:38:35 crc kubenswrapper[4691]: I0930 06:38:35.641719 4691 generic.go:334] "Generic (PLEG): container finished" podID="5c832fe6-a00d-4666-9f42-1adaae1d9007" containerID="c4bae847fff95c49052aac08cac27819361a386f663fe0425075f23a19795649" exitCode=0 Sep 30 06:38:35 crc kubenswrapper[4691]: I0930 06:38:35.641783 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fvqrc" event={"ID":"5c832fe6-a00d-4666-9f42-1adaae1d9007","Type":"ContainerDied","Data":"c4bae847fff95c49052aac08cac27819361a386f663fe0425075f23a19795649"} Sep 30 06:38:36 crc kubenswrapper[4691]: I0930 06:38:36.954564 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 06:38:36 crc kubenswrapper[4691]: I0930 06:38:36.955103 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.192452 4691 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.237814 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-scripts\") pod \"5c832fe6-a00d-4666-9f42-1adaae1d9007\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.238141 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g59hf\" (UniqueName: \"kubernetes.io/projected/5c832fe6-a00d-4666-9f42-1adaae1d9007-kube-api-access-g59hf\") pod \"5c832fe6-a00d-4666-9f42-1adaae1d9007\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.238206 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-config-data\") pod \"5c832fe6-a00d-4666-9f42-1adaae1d9007\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.238225 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-combined-ca-bundle\") pod \"5c832fe6-a00d-4666-9f42-1adaae1d9007\" (UID: \"5c832fe6-a00d-4666-9f42-1adaae1d9007\") " Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.244466 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c832fe6-a00d-4666-9f42-1adaae1d9007-kube-api-access-g59hf" (OuterVolumeSpecName: "kube-api-access-g59hf") pod "5c832fe6-a00d-4666-9f42-1adaae1d9007" (UID: "5c832fe6-a00d-4666-9f42-1adaae1d9007"). InnerVolumeSpecName "kube-api-access-g59hf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.244861 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-scripts" (OuterVolumeSpecName: "scripts") pod "5c832fe6-a00d-4666-9f42-1adaae1d9007" (UID: "5c832fe6-a00d-4666-9f42-1adaae1d9007"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.256168 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.269000 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.270259 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c832fe6-a00d-4666-9f42-1adaae1d9007" (UID: "5c832fe6-a00d-4666-9f42-1adaae1d9007"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.276516 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-config-data" (OuterVolumeSpecName: "config-data") pod "5c832fe6-a00d-4666-9f42-1adaae1d9007" (UID: "5c832fe6-a00d-4666-9f42-1adaae1d9007"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.294494 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.340970 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g59hf\" (UniqueName: \"kubernetes.io/projected/5c832fe6-a00d-4666-9f42-1adaae1d9007-kube-api-access-g59hf\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.340995 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.341005 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.341013 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c832fe6-a00d-4666-9f42-1adaae1d9007-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.351351 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67cb6557b7-q5zk6"] Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.351585 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" podUID="02b60d82-3894-4906-a792-84d9d0c2538e" containerName="dnsmasq-dns" containerID="cri-o://99ccc05210f364478faded7979c78de08cfdbabef8111547b044e25c689345fe" gracePeriod=10 Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.687703 4691 generic.go:334] "Generic (PLEG): container finished" podID="02b60d82-3894-4906-a792-84d9d0c2538e" containerID="99ccc05210f364478faded7979c78de08cfdbabef8111547b044e25c689345fe" exitCode=0 Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.687774 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" event={"ID":"02b60d82-3894-4906-a792-84d9d0c2538e","Type":"ContainerDied","Data":"99ccc05210f364478faded7979c78de08cfdbabef8111547b044e25c689345fe"} Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.690183 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-fvqrc" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.690183 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-fvqrc" event={"ID":"5c832fe6-a00d-4666-9f42-1adaae1d9007","Type":"ContainerDied","Data":"98bc0e69bb7b50e3eeedf0d7a4c7d88d6ebdedc29c538baa2c264665a9e00cd2"} Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.690302 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98bc0e69bb7b50e3eeedf0d7a4c7d88d6ebdedc29c538baa2c264665a9e00cd2" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.692906 4691 generic.go:334] "Generic (PLEG): container finished" podID="0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619" containerID="c2574f2c349e8d4c9d576972fe500c1613b4f900eb12a5a5c9b576d65fbfed51" exitCode=0 Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.693017 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dtmdb" event={"ID":"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619","Type":"ContainerDied","Data":"c2574f2c349e8d4c9d576972fe500c1613b4f900eb12a5a5c9b576d65fbfed51"} Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.723667 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.744778 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.854311 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-swift-storage-0\") pod \"02b60d82-3894-4906-a792-84d9d0c2538e\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.854438 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-nb\") pod \"02b60d82-3894-4906-a792-84d9d0c2538e\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.854641 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29wkn\" (UniqueName: \"kubernetes.io/projected/02b60d82-3894-4906-a792-84d9d0c2538e-kube-api-access-29wkn\") pod \"02b60d82-3894-4906-a792-84d9d0c2538e\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.854713 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-sb\") pod \"02b60d82-3894-4906-a792-84d9d0c2538e\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.854792 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-config\") pod \"02b60d82-3894-4906-a792-84d9d0c2538e\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.854870 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-svc\") pod \"02b60d82-3894-4906-a792-84d9d0c2538e\" (UID: \"02b60d82-3894-4906-a792-84d9d0c2538e\") " Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.867228 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02b60d82-3894-4906-a792-84d9d0c2538e-kube-api-access-29wkn" (OuterVolumeSpecName: "kube-api-access-29wkn") pod "02b60d82-3894-4906-a792-84d9d0c2538e" (UID: "02b60d82-3894-4906-a792-84d9d0c2538e"). InnerVolumeSpecName "kube-api-access-29wkn". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.869248 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.869458 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerName="nova-api-log" containerID="cri-o://fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366" gracePeriod=30 Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.869785 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerName="nova-api-api" containerID="cri-o://02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527" gracePeriod=30 Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.880089 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": EOF" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.881725 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.205:8774/\": EOF" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.926171 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "02b60d82-3894-4906-a792-84d9d0c2538e" (UID: "02b60d82-3894-4906-a792-84d9d0c2538e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.950567 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "02b60d82-3894-4906-a792-84d9d0c2538e" (UID: "02b60d82-3894-4906-a792-84d9d0c2538e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.951096 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "02b60d82-3894-4906-a792-84d9d0c2538e" (UID: "02b60d82-3894-4906-a792-84d9d0c2538e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.957752 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.957772 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29wkn\" (UniqueName: \"kubernetes.io/projected/02b60d82-3894-4906-a792-84d9d0c2538e-kube-api-access-29wkn\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.957782 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.957792 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.963211 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "02b60d82-3894-4906-a792-84d9d0c2538e" (UID: "02b60d82-3894-4906-a792-84d9d0c2538e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:38:37 crc kubenswrapper[4691]: I0930 06:38:37.976495 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-config" (OuterVolumeSpecName: "config") pod "02b60d82-3894-4906-a792-84d9d0c2538e" (UID: "02b60d82-3894-4906-a792-84d9d0c2538e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.059291 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.059323 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/02b60d82-3894-4906-a792-84d9d0c2538e-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.220977 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.704247 4691 generic.go:334] "Generic (PLEG): container finished" podID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerID="fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366" exitCode=143 Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.704348 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5292180f-82cd-4fe7-8bb6-5017fdc07cc4","Type":"ContainerDied","Data":"fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366"} Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.706111 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" event={"ID":"02b60d82-3894-4906-a792-84d9d0c2538e","Type":"ContainerDied","Data":"335cbf83626b016c722d2696db1e91dbd98294200053b48ef1c025713d009cff"} Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.706154 4691 scope.go:117] "RemoveContainer" containerID="99ccc05210f364478faded7979c78de08cfdbabef8111547b044e25c689345fe" Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.706280 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-67cb6557b7-q5zk6" Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.753342 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-67cb6557b7-q5zk6"] Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.759401 4691 scope.go:117] "RemoveContainer" containerID="f1a093915c38b87c7ffe83ea81e71d0931ff1af2141184b2d2980c0c107fff2b" Sep 30 06:38:38 crc kubenswrapper[4691]: I0930 06:38:38.761343 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-67cb6557b7-q5zk6"] Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.075038 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.193386 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-scripts\") pod \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.193531 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-config-data\") pod \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.193648 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjhd4\" (UniqueName: \"kubernetes.io/projected/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-kube-api-access-cjhd4\") pod \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.193688 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-combined-ca-bundle\") pod \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\" (UID: \"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.202949 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-scripts" (OuterVolumeSpecName: "scripts") pod "0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619" (UID: "0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.205299 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-kube-api-access-cjhd4" (OuterVolumeSpecName: "kube-api-access-cjhd4") pod "0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619" (UID: "0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619"). InnerVolumeSpecName "kube-api-access-cjhd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.233155 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-config-data" (OuterVolumeSpecName: "config-data") pod "0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619" (UID: "0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.257125 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02b60d82-3894-4906-a792-84d9d0c2538e" path="/var/lib/kubelet/pods/02b60d82-3894-4906-a792-84d9d0c2538e/volumes" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.273553 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619" (UID: "0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.296011 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjhd4\" (UniqueName: \"kubernetes.io/projected/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-kube-api-access-cjhd4\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.296045 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.296054 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.296062 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.634769 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.718199 4691 generic.go:334] "Generic (PLEG): container finished" podID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerID="bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a" exitCode=137 Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.718269 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.718287 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1e0286-c4b2-40de-aa35-50da1ced7b3d","Type":"ContainerDied","Data":"bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a"} Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.719429 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"db1e0286-c4b2-40de-aa35-50da1ced7b3d","Type":"ContainerDied","Data":"0428b8d2be746d95cf6b44538569e93729b4c122d91cc0b2c758a783f4bed8fe"} Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.719457 4691 scope.go:117] "RemoveContainer" containerID="bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.723352 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="fc428a66-0149-4f08-89d7-e9b5749e4bd5" containerName="nova-scheduler-scheduler" containerID="cri-o://09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05" gracePeriod=30 Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.723636 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-dtmdb" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.725993 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-dtmdb" event={"ID":"0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619","Type":"ContainerDied","Data":"3d935ef3f68065a90a1e78389e18ae5baa665eb8036888b7cde6a8111f8c1018"} Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.726030 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3d935ef3f68065a90a1e78389e18ae5baa665eb8036888b7cde6a8111f8c1018" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.757285 4691 scope.go:117] "RemoveContainer" containerID="9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.779585 4691 scope.go:117] "RemoveContainer" containerID="3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.803935 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-combined-ca-bundle\") pod \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.804033 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk7q4\" (UniqueName: \"kubernetes.io/projected/db1e0286-c4b2-40de-aa35-50da1ced7b3d-kube-api-access-xk7q4\") pod \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.804095 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-sg-core-conf-yaml\") pod \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.804255 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-log-httpd\") pod \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.804298 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-run-httpd\") pod \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.804348 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-scripts\") pod \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.804385 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-config-data\") pod \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\" (UID: \"db1e0286-c4b2-40de-aa35-50da1ced7b3d\") " Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805142 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "db1e0286-c4b2-40de-aa35-50da1ced7b3d" (UID: "db1e0286-c4b2-40de-aa35-50da1ced7b3d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805207 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805502 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "db1e0286-c4b2-40de-aa35-50da1ced7b3d" (UID: "db1e0286-c4b2-40de-aa35-50da1ced7b3d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.805646 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="proxy-httpd" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805667 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="proxy-httpd" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.805695 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b60d82-3894-4906-a792-84d9d0c2538e" containerName="init" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805701 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="02b60d82-3894-4906-a792-84d9d0c2538e" containerName="init" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.805718 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="ceilometer-central-agent" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805725 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="ceilometer-central-agent" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.805738 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="sg-core" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805744 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="sg-core" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.805764 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c832fe6-a00d-4666-9f42-1adaae1d9007" containerName="nova-manage" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805769 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c832fe6-a00d-4666-9f42-1adaae1d9007" containerName="nova-manage" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.805782 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="ceilometer-notification-agent" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805788 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="ceilometer-notification-agent" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.805805 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02b60d82-3894-4906-a792-84d9d0c2538e" containerName="dnsmasq-dns" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805810 4691 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="02b60d82-3894-4906-a792-84d9d0c2538e" containerName="dnsmasq-dns" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.805825 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619" containerName="nova-cell1-conductor-db-sync" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.805831 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619" containerName="nova-cell1-conductor-db-sync" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.807385 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="ceilometer-notification-agent" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.807402 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c832fe6-a00d-4666-9f42-1adaae1d9007" containerName="nova-manage" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.807418 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619" containerName="nova-cell1-conductor-db-sync" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.807430 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="sg-core" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.807441 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="proxy-httpd" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.807451 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="02b60d82-3894-4906-a792-84d9d0c2538e" containerName="dnsmasq-dns" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.807460 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" containerName="ceilometer-central-agent" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.808090 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.809694 4691 scope.go:117] "RemoveContainer" containerID="dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.810279 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-scripts" (OuterVolumeSpecName: "scripts") pod "db1e0286-c4b2-40de-aa35-50da1ced7b3d" (UID: "db1e0286-c4b2-40de-aa35-50da1ced7b3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.814287 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.819164 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.823761 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db1e0286-c4b2-40de-aa35-50da1ced7b3d-kube-api-access-xk7q4" (OuterVolumeSpecName: "kube-api-access-xk7q4") pod "db1e0286-c4b2-40de-aa35-50da1ced7b3d" (UID: "db1e0286-c4b2-40de-aa35-50da1ced7b3d"). InnerVolumeSpecName "kube-api-access-xk7q4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.848619 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "db1e0286-c4b2-40de-aa35-50da1ced7b3d" (UID: "db1e0286-c4b2-40de-aa35-50da1ced7b3d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.856830 4691 scope.go:117] "RemoveContainer" containerID="bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.857483 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a\": container with ID starting with bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a not found: ID does not exist" containerID="bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.857530 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a"} err="failed to get container status \"bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a\": rpc error: code = NotFound desc = could not find container \"bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a\": container with ID starting with bfc1c1ac0c802fcf2f08fd6cd8f864f8c310284e76c046e31f2cc00889a3bd5a not found: ID does not exist" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.857554 4691 scope.go:117] "RemoveContainer" containerID="9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.858278 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67\": container with ID starting with 9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67 not found: ID does not exist" containerID="9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.858324 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67"} err="failed to get container status \"9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67\": rpc error: code = NotFound desc = could not find container \"9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67\": container with ID starting with 9465f81ca628b815e22720da48adf88bc4c9e6bd874a84d19a8b5a169e4feb67 not found: ID does not exist" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.858349 4691 scope.go:117] "RemoveContainer" containerID="3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.858694 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f\": container with ID starting with 3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f not found: ID does not exist" 
containerID="3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.858717 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f"} err="failed to get container status \"3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f\": rpc error: code = NotFound desc = could not find container \"3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f\": container with ID starting with 3193c237b4fa967606e2c27c23f17f87ba693fa9642fe7bcc631599c73df621f not found: ID does not exist" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.858732 4691 scope.go:117] "RemoveContainer" containerID="dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82" Sep 30 06:38:39 crc kubenswrapper[4691]: E0930 06:38:39.859034 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82\": container with ID starting with dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82 not found: ID does not exist" containerID="dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.859053 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82"} err="failed to get container status \"dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82\": rpc error: code = NotFound desc = could not find container \"dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82\": container with ID starting with dc4528c6d8605bfad4e07ba2709f92b75d027e4079ee642a2ea767e8972d1f82 not found: ID does not exist" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.903243 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db1e0286-c4b2-40de-aa35-50da1ced7b3d" (UID: "db1e0286-c4b2-40de-aa35-50da1ced7b3d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.906275 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0efbbe5-44f2-4424-9a32-476f81246c28-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b0efbbe5-44f2-4424-9a32-476f81246c28\") " pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.906326 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zdkg\" (UniqueName: \"kubernetes.io/projected/b0efbbe5-44f2-4424-9a32-476f81246c28-kube-api-access-2zdkg\") pod \"nova-cell1-conductor-0\" (UID: \"b0efbbe5-44f2-4424-9a32-476f81246c28\") " pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.906364 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0efbbe5-44f2-4424-9a32-476f81246c28-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b0efbbe5-44f2-4424-9a32-476f81246c28\") " pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.906495 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.906512 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk7q4\" (UniqueName: \"kubernetes.io/projected/db1e0286-c4b2-40de-aa35-50da1ced7b3d-kube-api-access-xk7q4\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.906522 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.906531 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.906539 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/db1e0286-c4b2-40de-aa35-50da1ced7b3d-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.906548 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:39 crc kubenswrapper[4691]: I0930 06:38:39.935472 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-config-data" (OuterVolumeSpecName: "config-data") pod "db1e0286-c4b2-40de-aa35-50da1ced7b3d" (UID: "db1e0286-c4b2-40de-aa35-50da1ced7b3d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.008397 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0efbbe5-44f2-4424-9a32-476f81246c28-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b0efbbe5-44f2-4424-9a32-476f81246c28\") " pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.008442 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zdkg\" (UniqueName: \"kubernetes.io/projected/b0efbbe5-44f2-4424-9a32-476f81246c28-kube-api-access-2zdkg\") pod \"nova-cell1-conductor-0\" (UID: \"b0efbbe5-44f2-4424-9a32-476f81246c28\") " pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.008472 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0efbbe5-44f2-4424-9a32-476f81246c28-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b0efbbe5-44f2-4424-9a32-476f81246c28\") " pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.008546 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db1e0286-c4b2-40de-aa35-50da1ced7b3d-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.012108 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0efbbe5-44f2-4424-9a32-476f81246c28-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b0efbbe5-44f2-4424-9a32-476f81246c28\") " pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.012176 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0efbbe5-44f2-4424-9a32-476f81246c28-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b0efbbe5-44f2-4424-9a32-476f81246c28\") " pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.028768 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zdkg\" (UniqueName: \"kubernetes.io/projected/b0efbbe5-44f2-4424-9a32-476f81246c28-kube-api-access-2zdkg\") pod \"nova-cell1-conductor-0\" (UID: \"b0efbbe5-44f2-4424-9a32-476f81246c28\") " pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.118468 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.129996 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.143574 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.144332 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.148543 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.155919 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.158308 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.158491 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.314003 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-log-httpd\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.314237 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-config-data\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.314268 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-scripts\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.314363 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.314387 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkqgh\" (UniqueName: \"kubernetes.io/projected/4528ae48-9ab1-424c-8cb0-d5cb6996562b-kube-api-access-mkqgh\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.314429 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-run-httpd\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.314461 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.415971 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 
06:38:40.416054 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-log-httpd\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.416084 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-config-data\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.416123 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-scripts\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.416225 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.416244 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkqgh\" (UniqueName: \"kubernetes.io/projected/4528ae48-9ab1-424c-8cb0-d5cb6996562b-kube-api-access-mkqgh\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.416281 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-run-httpd\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.416641 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-run-httpd\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.417979 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-log-httpd\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.422178 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.432339 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.432653 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-scripts\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.433527 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-config-data\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.434956 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkqgh\" (UniqueName: \"kubernetes.io/projected/4528ae48-9ab1-424c-8cb0-d5cb6996562b-kube-api-access-mkqgh\") pod \"ceilometer-0\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.471862 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.584441 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.735504 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b0efbbe5-44f2-4424-9a32-476f81246c28","Type":"ContainerStarted","Data":"88245141d84cefafa353ef2d4a8f4d29d4bdd26ee81ba146a13185139266f499"} Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.978739 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:38:40 crc kubenswrapper[4691]: W0930 06:38:40.986307 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4528ae48_9ab1_424c_8cb0_d5cb6996562b.slice/crio-1d99baf41abb261804ca7ccb2045cfdcf3beb389c04a1fb87cd6173f9d309bbf WatchSource:0}: Error finding container 1d99baf41abb261804ca7ccb2045cfdcf3beb389c04a1fb87cd6173f9d309bbf: Status 404 returned error can't find the container with id 1d99baf41abb261804ca7ccb2045cfdcf3beb389c04a1fb87cd6173f9d309bbf Sep 30 06:38:40 crc kubenswrapper[4691]: I0930 06:38:40.989047 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:38:41 crc kubenswrapper[4691]: I0930 06:38:41.241345 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db1e0286-c4b2-40de-aa35-50da1ced7b3d" path="/var/lib/kubelet/pods/db1e0286-c4b2-40de-aa35-50da1ced7b3d/volumes" Sep 30 06:38:41 crc kubenswrapper[4691]: I0930 06:38:41.749235 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4528ae48-9ab1-424c-8cb0-d5cb6996562b","Type":"ContainerStarted","Data":"374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434"} Sep 30 06:38:41 crc kubenswrapper[4691]: I0930 06:38:41.749293 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4528ae48-9ab1-424c-8cb0-d5cb6996562b","Type":"ContainerStarted","Data":"7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852"} Sep 30 06:38:41 crc kubenswrapper[4691]: I0930 06:38:41.749305 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4528ae48-9ab1-424c-8cb0-d5cb6996562b","Type":"ContainerStarted","Data":"1d99baf41abb261804ca7ccb2045cfdcf3beb389c04a1fb87cd6173f9d309bbf"} Sep 30 06:38:41 crc kubenswrapper[4691]: I0930 06:38:41.751432 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b0efbbe5-44f2-4424-9a32-476f81246c28","Type":"ContainerStarted","Data":"928f84cb8817c6af88f73fd9cd833b0ea61df703bd12cb0e3e5c59605b876889"} Sep 30 06:38:41 crc kubenswrapper[4691]: I0930 06:38:41.751921 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:41 crc kubenswrapper[4691]: I0930 06:38:41.782208 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.782188762 podStartE2EDuration="2.782188762s" podCreationTimestamp="2025-09-30 06:38:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:38:41.771834648 +0000 UTC m=+1165.246855688" watchObservedRunningTime="2025-09-30 06:38:41.782188762 +0000 UTC m=+1165.257209802" Sep 30 06:38:42 crc kubenswrapper[4691]: E0930 06:38:42.257030 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 06:38:42 crc kubenswrapper[4691]: E0930 06:38:42.274151 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 06:38:42 crc kubenswrapper[4691]: E0930 06:38:42.280029 4691 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Sep 30 06:38:42 crc kubenswrapper[4691]: E0930 06:38:42.280095 4691 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="fc428a66-0149-4f08-89d7-e9b5749e4bd5" containerName="nova-scheduler-scheduler" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.528189 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.660964 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chrjx\" (UniqueName: \"kubernetes.io/projected/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-kube-api-access-chrjx\") pod \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.661014 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-config-data\") pod \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.661172 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-logs\") pod \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.661246 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-combined-ca-bundle\") pod \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\" (UID: \"5292180f-82cd-4fe7-8bb6-5017fdc07cc4\") " Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.661994 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-logs" (OuterVolumeSpecName: "logs") pod "5292180f-82cd-4fe7-8bb6-5017fdc07cc4" (UID: "5292180f-82cd-4fe7-8bb6-5017fdc07cc4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.669017 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-kube-api-access-chrjx" (OuterVolumeSpecName: "kube-api-access-chrjx") pod "5292180f-82cd-4fe7-8bb6-5017fdc07cc4" (UID: "5292180f-82cd-4fe7-8bb6-5017fdc07cc4"). InnerVolumeSpecName "kube-api-access-chrjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.688543 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5292180f-82cd-4fe7-8bb6-5017fdc07cc4" (UID: "5292180f-82cd-4fe7-8bb6-5017fdc07cc4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.692141 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-config-data" (OuterVolumeSpecName: "config-data") pod "5292180f-82cd-4fe7-8bb6-5017fdc07cc4" (UID: "5292180f-82cd-4fe7-8bb6-5017fdc07cc4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.762906 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.763130 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.763199 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chrjx\" (UniqueName: \"kubernetes.io/projected/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-kube-api-access-chrjx\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.763388 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5292180f-82cd-4fe7-8bb6-5017fdc07cc4-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.763773 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4528ae48-9ab1-424c-8cb0-d5cb6996562b","Type":"ContainerStarted","Data":"1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5"} Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.765139 4691 generic.go:334] "Generic (PLEG): container finished" podID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerID="02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527" exitCode=0 Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.765760 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.769780 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5292180f-82cd-4fe7-8bb6-5017fdc07cc4","Type":"ContainerDied","Data":"02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527"} Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.769833 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5292180f-82cd-4fe7-8bb6-5017fdc07cc4","Type":"ContainerDied","Data":"76d1f2b939ff24113c9be76c16f1c1e5a9b8a4d9ad81641b8627e47fa994164d"} Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.769851 4691 scope.go:117] "RemoveContainer" containerID="02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.804281 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.821472 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.828926 4691 scope.go:117] "RemoveContainer" containerID="fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.833415 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 06:38:42 crc kubenswrapper[4691]: E0930 06:38:42.833846 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerName="nova-api-api" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.833864 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerName="nova-api-api" Sep 30 06:38:42 crc kubenswrapper[4691]: E0930 06:38:42.833926 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerName="nova-api-log" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.833933 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerName="nova-api-log" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.834113 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerName="nova-api-api" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.834132 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" containerName="nova-api-log" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.835272 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.839419 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.844433 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.876979 4691 scope.go:117] "RemoveContainer" containerID="02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527" Sep 30 06:38:42 crc kubenswrapper[4691]: E0930 06:38:42.877425 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527\": container with ID starting with 02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527 not found: ID does not exist" containerID="02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.877456 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527"} err="failed to get container status \"02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527\": rpc error: code = NotFound desc = could not find container \"02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527\": container with ID starting with 02bf8afb49936dc693509565049ffdb8c2f018d8c5453dbf221b452f3ce3b527 not found: ID does not exist" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.877476 4691 scope.go:117] "RemoveContainer" containerID="fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366" Sep 30 06:38:42 crc kubenswrapper[4691]: E0930 06:38:42.877665 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366\": container with ID starting with fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366 not found: ID does not exist" containerID="fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.877685 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366"} err="failed to get container status \"fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366\": rpc error: code = NotFound desc = could not find container \"fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366\": container with ID starting with fe0833b835aad6001045481370c96f8840cbc23c38a5ed4a2ee7167864378366 not found: ID does not exist" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.967379 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-logs\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.967435 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " 
pod="openstack/nova-api-0" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.967635 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dczr\" (UniqueName: \"kubernetes.io/projected/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-kube-api-access-2dczr\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 06:38:42 crc kubenswrapper[4691]: I0930 06:38:42.967704 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-config-data\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.079251 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dczr\" (UniqueName: \"kubernetes.io/projected/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-kube-api-access-2dczr\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.079345 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-config-data\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.088525 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-config-data\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.088705 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-logs\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.088799 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.089389 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-logs\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.091369 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.101047 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dczr\" (UniqueName: \"kubernetes.io/projected/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-kube-api-access-2dczr\") pod \"nova-api-0\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " pod="openstack/nova-api-0" Sep 30 
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.161469 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.249228 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5292180f-82cd-4fe7-8bb6-5017fdc07cc4" path="/var/lib/kubelet/pods/5292180f-82cd-4fe7-8bb6-5017fdc07cc4/volumes"
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.635910 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.673666 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.699604 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-config-data\") pod \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") "
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.699677 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nc9zd\" (UniqueName: \"kubernetes.io/projected/fc428a66-0149-4f08-89d7-e9b5749e4bd5-kube-api-access-nc9zd\") pod \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") "
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.699824 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-combined-ca-bundle\") pod \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\" (UID: \"fc428a66-0149-4f08-89d7-e9b5749e4bd5\") "
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.711780 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc428a66-0149-4f08-89d7-e9b5749e4bd5-kube-api-access-nc9zd" (OuterVolumeSpecName: "kube-api-access-nc9zd") pod "fc428a66-0149-4f08-89d7-e9b5749e4bd5" (UID: "fc428a66-0149-4f08-89d7-e9b5749e4bd5"). InnerVolumeSpecName "kube-api-access-nc9zd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.768836 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc428a66-0149-4f08-89d7-e9b5749e4bd5" (UID: "fc428a66-0149-4f08-89d7-e9b5749e4bd5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.774612 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-config-data" (OuterVolumeSpecName: "config-data") pod "fc428a66-0149-4f08-89d7-e9b5749e4bd5" (UID: "fc428a66-0149-4f08-89d7-e9b5749e4bd5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.782539 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72","Type":"ContainerStarted","Data":"e0289f034f94f0daf228e0ba060236603fd112385db9c97e21668703f08f8a8c"}
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.787112 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.788407 4691 generic.go:334] "Generic (PLEG): container finished" podID="fc428a66-0149-4f08-89d7-e9b5749e4bd5" containerID="09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05" exitCode=0
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.788471 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc428a66-0149-4f08-89d7-e9b5749e4bd5","Type":"ContainerDied","Data":"09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05"}
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.788489 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"fc428a66-0149-4f08-89d7-e9b5749e4bd5","Type":"ContainerDied","Data":"d7ce394cd7c959c17a8afe33e6feff262c007833f9dd31ca03275813d8fcdf72"}
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.788518 4691 scope.go:117] "RemoveContainer" containerID="09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05"
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.788605 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.802225 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.802253 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nc9zd\" (UniqueName: \"kubernetes.io/projected/fc428a66-0149-4f08-89d7-e9b5749e4bd5-kube-api-access-nc9zd\") on node \"crc\" DevicePath \"\""
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.802263 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc428a66-0149-4f08-89d7-e9b5749e4bd5-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.813795 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.294980936 podStartE2EDuration="3.813762129s" podCreationTimestamp="2025-09-30 06:38:40 +0000 UTC" firstStartedPulling="2025-09-30 06:38:40.988803372 +0000 UTC m=+1164.463824412" lastFinishedPulling="2025-09-30 06:38:43.507584565 +0000 UTC m=+1166.982605605" observedRunningTime="2025-09-30 06:38:43.804058239 +0000 UTC m=+1167.279079279" watchObservedRunningTime="2025-09-30 06:38:43.813762129 +0000 UTC m=+1167.288783169"
Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.825096 4691 scope.go:117] "RemoveContainer" containerID="09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05"
Sep 30 06:38:43 crc kubenswrapper[4691]: E0930 06:38:43.825697 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05\": container with ID starting with 09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05 not found: ID does not exist" containerID="09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05"
\"09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05\": container with ID starting with 09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05 not found: ID does not exist" containerID="09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.825723 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05"} err="failed to get container status \"09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05\": rpc error: code = NotFound desc = could not find container \"09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05\": container with ID starting with 09b919abf4e0a07a23272d86e2c52a73a49f66b747a38fe48f877c0008b23e05 not found: ID does not exist" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.871706 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.888340 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.901616 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:38:43 crc kubenswrapper[4691]: E0930 06:38:43.902488 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc428a66-0149-4f08-89d7-e9b5749e4bd5" containerName="nova-scheduler-scheduler" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.902506 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc428a66-0149-4f08-89d7-e9b5749e4bd5" containerName="nova-scheduler-scheduler" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.902686 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc428a66-0149-4f08-89d7-e9b5749e4bd5" containerName="nova-scheduler-scheduler" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.903423 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.921437 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 06:38:43 crc kubenswrapper[4691]: I0930 06:38:43.931928 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.006227 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-config-data\") pod \"nova-scheduler-0\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.006281 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.006407 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngttl\" (UniqueName: \"kubernetes.io/projected/32d7b989-18a3-4e18-8eb5-e3f7856ed003-kube-api-access-ngttl\") pod \"nova-scheduler-0\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.107587 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngttl\" (UniqueName: \"kubernetes.io/projected/32d7b989-18a3-4e18-8eb5-e3f7856ed003-kube-api-access-ngttl\") pod \"nova-scheduler-0\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.107676 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-config-data\") pod \"nova-scheduler-0\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.107707 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.113441 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.125758 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-config-data\") pod \"nova-scheduler-0\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.126635 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngttl\" (UniqueName: 
\"kubernetes.io/projected/32d7b989-18a3-4e18-8eb5-e3f7856ed003-kube-api-access-ngttl\") pod \"nova-scheduler-0\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " pod="openstack/nova-scheduler-0" Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.225899 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 06:38:44 crc kubenswrapper[4691]: W0930 06:38:44.684495 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod32d7b989_18a3_4e18_8eb5_e3f7856ed003.slice/crio-fff6814417c248ce18160414bf0d6d2450b1d7c41a9a51a9a949219b9793da34 WatchSource:0}: Error finding container fff6814417c248ce18160414bf0d6d2450b1d7c41a9a51a9a949219b9793da34: Status 404 returned error can't find the container with id fff6814417c248ce18160414bf0d6d2450b1d7c41a9a51a9a949219b9793da34 Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.690395 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.803719 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72","Type":"ContainerStarted","Data":"f8edbd476f29040c250467d5a4c086a09ff5bb7829dd180fc19a411589723007"} Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.803761 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72","Type":"ContainerStarted","Data":"0ad19fb7829d03a8a4136df7f8c7600016368715c9509a97e342d1da83350e0d"} Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.807810 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4528ae48-9ab1-424c-8cb0-d5cb6996562b","Type":"ContainerStarted","Data":"a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b"} Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.811124 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32d7b989-18a3-4e18-8eb5-e3f7856ed003","Type":"ContainerStarted","Data":"fff6814417c248ce18160414bf0d6d2450b1d7c41a9a51a9a949219b9793da34"} Sep 30 06:38:44 crc kubenswrapper[4691]: I0930 06:38:44.832490 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.83246756 podStartE2EDuration="2.83246756s" podCreationTimestamp="2025-09-30 06:38:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:38:44.821924811 +0000 UTC m=+1168.296945881" watchObservedRunningTime="2025-09-30 06:38:44.83246756 +0000 UTC m=+1168.307488620" Sep 30 06:38:45 crc kubenswrapper[4691]: I0930 06:38:45.177978 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Sep 30 06:38:45 crc kubenswrapper[4691]: I0930 06:38:45.238725 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc428a66-0149-4f08-89d7-e9b5749e4bd5" path="/var/lib/kubelet/pods/fc428a66-0149-4f08-89d7-e9b5749e4bd5/volumes" Sep 30 06:38:45 crc kubenswrapper[4691]: I0930 06:38:45.831573 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32d7b989-18a3-4e18-8eb5-e3f7856ed003","Type":"ContainerStarted","Data":"bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c"} Sep 30 06:38:45 crc 
kubenswrapper[4691]: I0930 06:38:45.867071 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.867048767 podStartE2EDuration="2.867048767s" podCreationTimestamp="2025-09-30 06:38:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:38:45.845337937 +0000 UTC m=+1169.320358997" watchObservedRunningTime="2025-09-30 06:38:45.867048767 +0000 UTC m=+1169.342069837" Sep 30 06:38:49 crc kubenswrapper[4691]: I0930 06:38:49.239765 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 06:38:53 crc kubenswrapper[4691]: I0930 06:38:53.161862 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 06:38:53 crc kubenswrapper[4691]: I0930 06:38:53.162535 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 06:38:54 crc kubenswrapper[4691]: I0930 06:38:54.226040 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 06:38:54 crc kubenswrapper[4691]: I0930 06:38:54.245206 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 06:38:54 crc kubenswrapper[4691]: I0930 06:38:54.245310 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.213:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 06:38:54 crc kubenswrapper[4691]: I0930 06:38:54.280855 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 06:38:54 crc kubenswrapper[4691]: I0930 06:38:54.987930 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.042630 4691 generic.go:334] "Generic (PLEG): container finished" podID="92ee0693-710e-43cd-a7f6-823fcd0acf42" containerID="37dd74b2067d2407970a9ec08d0544da7cedbe0561f322a4e76318d781fc5140" exitCode=137 Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.043175 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"92ee0693-710e-43cd-a7f6-823fcd0acf42","Type":"ContainerDied","Data":"37dd74b2067d2407970a9ec08d0544da7cedbe0561f322a4e76318d781fc5140"} Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.045256 4691 generic.go:334] "Generic (PLEG): container finished" podID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" containerID="1c582eb449995b0023fe2b4ebc8e15a19718f42294e28fe52551eeaf0d5bcbb2" exitCode=137 Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.045289 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3555510-11fa-4c8a-bf0b-9196b7a61f36","Type":"ContainerDied","Data":"1c582eb449995b0023fe2b4ebc8e15a19718f42294e28fe52551eeaf0d5bcbb2"} Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.131593 4691 util.go:48] "No ready sandbox for pod can be found. 
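[Editor's note] The two nova-api-0 "Probe failed" lines above are plain HTTP GETs against the pod IP that time out while the API is still warming up. A minimal sketch of the same check (the 1s client timeout is an assumption; the kubelet uses the timeoutSeconds from the pod's probe spec):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get("http://10.217.0.213:8774/") // nova-api pod IP from the log
	if err != nil {
		// While nova-api is still starting, this mirrors the logged output:
		// "context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
		fmt.Println("probe failure:", err)
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe success:", resp.Status)
}
```

Once the server answers, the same probe flips to "startup" status="started", as it does at 06:39:03 below. The exitCode=137 lines that follow are unrelated: they are the old nova-metadata-0 and nova-cell1-novncproxy-0 containers being SIGKILLed (128 + 9) during their replacement.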
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.137664 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.236170 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3555510-11fa-4c8a-bf0b-9196b7a61f36-logs\") pod \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") "
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.236217 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc97q\" (UniqueName: \"kubernetes.io/projected/92ee0693-710e-43cd-a7f6-823fcd0acf42-kube-api-access-mc97q\") pod \"92ee0693-710e-43cd-a7f6-823fcd0acf42\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") "
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.236265 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzsbv\" (UniqueName: \"kubernetes.io/projected/e3555510-11fa-4c8a-bf0b-9196b7a61f36-kube-api-access-rzsbv\") pod \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") "
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.236360 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-config-data\") pod \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") "
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.236405 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-combined-ca-bundle\") pod \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\" (UID: \"e3555510-11fa-4c8a-bf0b-9196b7a61f36\") "
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.236447 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-config-data\") pod \"92ee0693-710e-43cd-a7f6-823fcd0acf42\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") "
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.236469 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-combined-ca-bundle\") pod \"92ee0693-710e-43cd-a7f6-823fcd0acf42\" (UID: \"92ee0693-710e-43cd-a7f6-823fcd0acf42\") "
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.236502 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3555510-11fa-4c8a-bf0b-9196b7a61f36-logs" (OuterVolumeSpecName: "logs") pod "e3555510-11fa-4c8a-bf0b-9196b7a61f36" (UID: "e3555510-11fa-4c8a-bf0b-9196b7a61f36"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.236792 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e3555510-11fa-4c8a-bf0b-9196b7a61f36-logs\") on node \"crc\" DevicePath \"\""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.241877 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ee0693-710e-43cd-a7f6-823fcd0acf42-kube-api-access-mc97q" (OuterVolumeSpecName: "kube-api-access-mc97q") pod "92ee0693-710e-43cd-a7f6-823fcd0acf42" (UID: "92ee0693-710e-43cd-a7f6-823fcd0acf42"). InnerVolumeSpecName "kube-api-access-mc97q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.252384 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3555510-11fa-4c8a-bf0b-9196b7a61f36-kube-api-access-rzsbv" (OuterVolumeSpecName: "kube-api-access-rzsbv") pod "e3555510-11fa-4c8a-bf0b-9196b7a61f36" (UID: "e3555510-11fa-4c8a-bf0b-9196b7a61f36"). InnerVolumeSpecName "kube-api-access-rzsbv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.266430 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-config-data" (OuterVolumeSpecName: "config-data") pod "e3555510-11fa-4c8a-bf0b-9196b7a61f36" (UID: "e3555510-11fa-4c8a-bf0b-9196b7a61f36"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.273492 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92ee0693-710e-43cd-a7f6-823fcd0acf42" (UID: "92ee0693-710e-43cd-a7f6-823fcd0acf42"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.274257 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e3555510-11fa-4c8a-bf0b-9196b7a61f36" (UID: "e3555510-11fa-4c8a-bf0b-9196b7a61f36"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.277960 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-config-data" (OuterVolumeSpecName: "config-data") pod "92ee0693-710e-43cd-a7f6-823fcd0acf42" (UID: "92ee0693-710e-43cd-a7f6-823fcd0acf42"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.337558 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc97q\" (UniqueName: \"kubernetes.io/projected/92ee0693-710e-43cd-a7f6-823fcd0acf42-kube-api-access-mc97q\") on node \"crc\" DevicePath \"\""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.337592 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzsbv\" (UniqueName: \"kubernetes.io/projected/e3555510-11fa-4c8a-bf0b-9196b7a61f36-kube-api-access-rzsbv\") on node \"crc\" DevicePath \"\""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.337607 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.337622 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3555510-11fa-4c8a-bf0b-9196b7a61f36-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.337633 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 06:39:02 crc kubenswrapper[4691]: I0930 06:39:02.337643 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ee0693-710e-43cd-a7f6-823fcd0acf42-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.061728 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.061786 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e3555510-11fa-4c8a-bf0b-9196b7a61f36","Type":"ContainerDied","Data":"254e8bc3af4ee6fc1b8f9812d4ddd4418660149226b672b88afff7e8fb971012"}
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.061984 4691 scope.go:117] "RemoveContainer" containerID="1c582eb449995b0023fe2b4ebc8e15a19718f42294e28fe52551eeaf0d5bcbb2"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.065031 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"92ee0693-710e-43cd-a7f6-823fcd0acf42","Type":"ContainerDied","Data":"3f0078cada3bedce31a70406666b76110370196bfc5fde48570a713fcf9120ed"}
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.065138 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.105848 4691 scope.go:117] "RemoveContainer" containerID="2d466f630232df7ed4e680aee3df7a0daf4cfc0f6007f4c5daf91a421d70e73a"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.109591 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.129675 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.156988 4691 scope.go:117] "RemoveContainer" containerID="37dd74b2067d2407970a9ec08d0544da7cedbe0561f322a4e76318d781fc5140"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.157232 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.192965 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.198987 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.200420 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.200566 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.205627 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 06:39:03 crc kubenswrapper[4691]: E0930 06:39:03.206210 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" containerName="nova-metadata-metadata"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.206226 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" containerName="nova-metadata-metadata"
Sep 30 06:39:03 crc kubenswrapper[4691]: E0930 06:39:03.206297 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ee0693-710e-43cd-a7f6-823fcd0acf42" containerName="nova-cell1-novncproxy-novncproxy"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.206306 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ee0693-710e-43cd-a7f6-823fcd0acf42" containerName="nova-cell1-novncproxy-novncproxy"
Sep 30 06:39:03 crc kubenswrapper[4691]: E0930 06:39:03.206348 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" containerName="nova-metadata-log"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.206367 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" containerName="nova-metadata-log"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.206622 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ee0693-710e-43cd-a7f6-823fcd0acf42" containerName="nova-cell1-novncproxy-novncproxy"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.206643 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" containerName="nova-metadata-metadata"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.206661 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" containerName="nova-metadata-log"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.207490 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.213343 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.213408 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.213662 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.215074 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.215221 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.253697 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92ee0693-710e-43cd-a7f6-823fcd0acf42" path="/var/lib/kubelet/pods/92ee0693-710e-43cd-a7f6-823fcd0acf42/volumes"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.254792 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3555510-11fa-4c8a-bf0b-9196b7a61f36" path="/var/lib/kubelet/pods/e3555510-11fa-4c8a-bf0b-9196b7a61f36/volumes"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.255583 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.257862 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.257978 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.259158 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7mtz\" (UniqueName: \"kubernetes.io/projected/3a61f6fc-3212-4050-92f5-363ed195680f-kube-api-access-t7mtz\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.259252 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.259285 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.259302 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.259344 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.259505 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.259715 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.360584 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.360633 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.360664 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.360715 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.360837 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7mtz\" (UniqueName: \"kubernetes.io/projected/3a61f6fc-3212-4050-92f5-363ed195680f-kube-api-access-t7mtz\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.360865 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-logs\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.360942 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-config-data\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.360962 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.360991 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq5nr\" (UniqueName: \"kubernetes.io/projected/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-kube-api-access-nq5nr\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.361094 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.365982 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.375346 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.376536 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.378848 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a61f6fc-3212-4050-92f5-363ed195680f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.380457 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7mtz\" (UniqueName: \"kubernetes.io/projected/3a61f6fc-3212-4050-92f5-363ed195680f-kube-api-access-t7mtz\") pod \"nova-cell1-novncproxy-0\" (UID: \"3a61f6fc-3212-4050-92f5-363ed195680f\") " pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.466724 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.467656 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-logs\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.467933 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-config-data\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.467964 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.467996 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq5nr\" (UniqueName: \"kubernetes.io/projected/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-kube-api-access-nq5nr\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.469093 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-logs\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.471723 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-config-data\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.472310 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.475627 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.489552 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq5nr\" (UniqueName: \"kubernetes.io/projected/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-kube-api-access-nq5nr\") pod \"nova-metadata-0\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " pod="openstack/nova-metadata-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.531410 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Sep 30 06:39:03 crc kubenswrapper[4691]: I0930 06:39:03.576469 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.060681 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.082961 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.090660 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.194570 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.331971 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c45569577-gvk65"]
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.357910 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.361757 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c45569577-gvk65"]
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.495547 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-sb\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.495662 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-svc\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.495690 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-config\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.495733 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-swift-storage-0\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.495759 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-nb\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.495789 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpmdf\" (UniqueName: \"kubernetes.io/projected/2caca8a1-81a4-40f6-9dc6-bcd84120889d-kube-api-access-wpmdf\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.597297 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-svc\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.597346 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-config\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.597396 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-swift-storage-0\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.597427 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-nb\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.597458 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpmdf\" (UniqueName: \"kubernetes.io/projected/2caca8a1-81a4-40f6-9dc6-bcd84120889d-kube-api-access-wpmdf\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.597504 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-sb\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.598178 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-config\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.598178 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-svc\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.598307 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-swift-storage-0\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.598651 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-nb\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.602913 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-sb\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.618946 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpmdf\" (UniqueName: \"kubernetes.io/projected/2caca8a1-81a4-40f6-9dc6-bcd84120889d-kube-api-access-wpmdf\") pod \"dnsmasq-dns-6c45569577-gvk65\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:04 crc kubenswrapper[4691]: I0930 06:39:04.725593 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c45569577-gvk65"
Sep 30 06:39:05 crc kubenswrapper[4691]: I0930 06:39:05.091067 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3a61f6fc-3212-4050-92f5-363ed195680f","Type":"ContainerStarted","Data":"ab8c1ed8e93a10bd15ca15e1a989e8d43d7c63aa47bcfcca343fd72727a25122"}
Sep 30 06:39:05 crc kubenswrapper[4691]: I0930 06:39:05.091108 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"3a61f6fc-3212-4050-92f5-363ed195680f","Type":"ContainerStarted","Data":"f9e5b3d9a77a529fee1e1cac2e68be0c65373f2ed632f6ae3337cef73050f12e"}
Sep 30 06:39:05 crc kubenswrapper[4691]: I0930 06:39:05.094194 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb","Type":"ContainerStarted","Data":"cd752a11f112b7e1fba4875c4d6b5ab82518ce13fcc8be6237889e56a52a800d"}
Sep 30 06:39:05 crc kubenswrapper[4691]: I0930 06:39:05.094240 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb","Type":"ContainerStarted","Data":"3f35b801a7adea0b68ae6fca821129b794ddb77d7d6a746ac413ab3a83955888"}
Sep 30 06:39:05 crc kubenswrapper[4691]: I0930 06:39:05.094254 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb","Type":"ContainerStarted","Data":"e8b8322eb40bfe7a9b99406d248176be8ee5bb7b9482b25f61cbc1fbff4d09de"}
Sep 30 06:39:05 crc kubenswrapper[4691]: I0930 06:39:05.137526 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.137505541 podStartE2EDuration="2.137505541s" podCreationTimestamp="2025-09-30 06:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:39:05.131126819 +0000 UTC m=+1188.606147869" watchObservedRunningTime="2025-09-30 06:39:05.137505541 +0000 UTC m=+1188.612526581"
Sep 30 06:39:05 crc kubenswrapper[4691]: I0930 06:39:05.141368 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.141353974 podStartE2EDuration="2.141353974s" podCreationTimestamp="2025-09-30 06:39:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:39:05.110239886 +0000 UTC m=+1188.585260936" watchObservedRunningTime="2025-09-30 06:39:05.141353974 +0000 UTC m=+1188.616375014"
Sep 30 06:39:05 crc kubenswrapper[4691]: I0930 06:39:05.213544 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c45569577-gvk65"]
Sep 30 06:39:05 crc kubenswrapper[4691]: W0930 06:39:05.214432 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2caca8a1_81a4_40f6_9dc6_bcd84120889d.slice/crio-e84a7d408786bf83e92079a4c7ff754c8b71e4703385776fc6c293c921c6f952 WatchSource:0}: Error finding container e84a7d408786bf83e92079a4c7ff754c8b71e4703385776fc6c293c921c6f952: Status 404 returned error can't find the container with id e84a7d408786bf83e92079a4c7ff754c8b71e4703385776fc6c293c921c6f952
Sep 30 06:39:06 crc kubenswrapper[4691]: I0930 06:39:06.102792 4691 generic.go:334] "Generic (PLEG): container finished" podID="2caca8a1-81a4-40f6-9dc6-bcd84120889d" containerID="dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389" exitCode=0
Sep 30 06:39:06 crc kubenswrapper[4691]: I0930 06:39:06.102838 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c45569577-gvk65" event={"ID":"2caca8a1-81a4-40f6-9dc6-bcd84120889d","Type":"ContainerDied","Data":"dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389"}
Sep 30 06:39:06 crc kubenswrapper[4691]: I0930 06:39:06.103120 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c45569577-gvk65" event={"ID":"2caca8a1-81a4-40f6-9dc6-bcd84120889d","Type":"ContainerStarted","Data":"e84a7d408786bf83e92079a4c7ff754c8b71e4703385776fc6c293c921c6f952"}
Sep 30 06:39:06 crc kubenswrapper[4691]: I0930 06:39:06.701828 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Sep 30 06:39:06 crc kubenswrapper[4691]: I0930 06:39:06.702426 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="ceilometer-central-agent" containerID="cri-o://7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852" gracePeriod=30
Sep 30 06:39:06 crc kubenswrapper[4691]: I0930 06:39:06.702578 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="proxy-httpd" containerID="cri-o://a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b" gracePeriod=30
Sep 30 06:39:06 crc kubenswrapper[4691]: I0930 06:39:06.702598 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="ceilometer-notification-agent" containerID="cri-o://374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434" gracePeriod=30
Sep 30 06:39:06 crc kubenswrapper[4691]: I0930 06:39:06.702660 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="sg-core" containerID="cri-o://1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5" gracePeriod=30
Sep 30 06:39:06 crc kubenswrapper[4691]: I0930 06:39:06.717294 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.212:3000/\": EOF"
Sep 30 06:39:06 crc kubenswrapper[4691]: I0930 06:39:06.958029 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Sep 30 06:39:07 crc kubenswrapper[4691]: I0930 06:39:07.115279 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c45569577-gvk65" event={"ID":"2caca8a1-81a4-40f6-9dc6-bcd84120889d","Type":"ContainerStarted","Data":"c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87"}
Sep 30 06:39:07 crc kubenswrapper[4691]: I0930 06:39:07.116124 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status=""
pod="openstack/dnsmasq-dns-6c45569577-gvk65" Sep 30 06:39:07 crc kubenswrapper[4691]: I0930 06:39:07.118786 4691 generic.go:334] "Generic (PLEG): container finished" podID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerID="a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b" exitCode=0 Sep 30 06:39:07 crc kubenswrapper[4691]: I0930 06:39:07.118812 4691 generic.go:334] "Generic (PLEG): container finished" podID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerID="1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5" exitCode=2 Sep 30 06:39:07 crc kubenswrapper[4691]: I0930 06:39:07.118854 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4528ae48-9ab1-424c-8cb0-d5cb6996562b","Type":"ContainerDied","Data":"a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b"} Sep 30 06:39:07 crc kubenswrapper[4691]: I0930 06:39:07.118874 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4528ae48-9ab1-424c-8cb0-d5cb6996562b","Type":"ContainerDied","Data":"1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5"} Sep 30 06:39:07 crc kubenswrapper[4691]: I0930 06:39:07.118954 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerName="nova-api-log" containerID="cri-o://0ad19fb7829d03a8a4136df7f8c7600016368715c9509a97e342d1da83350e0d" gracePeriod=30 Sep 30 06:39:07 crc kubenswrapper[4691]: I0930 06:39:07.119003 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerName="nova-api-api" containerID="cri-o://f8edbd476f29040c250467d5a4c086a09ff5bb7829dd180fc19a411589723007" gracePeriod=30 Sep 30 06:39:07 crc kubenswrapper[4691]: I0930 06:39:07.136892 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c45569577-gvk65" podStartSLOduration=3.136866698 podStartE2EDuration="3.136866698s" podCreationTimestamp="2025-09-30 06:39:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:39:07.13095476 +0000 UTC m=+1190.605975810" watchObservedRunningTime="2025-09-30 06:39:07.136866698 +0000 UTC m=+1190.611887738" Sep 30 06:39:08 crc kubenswrapper[4691]: I0930 06:39:08.131629 4691 generic.go:334] "Generic (PLEG): container finished" podID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerID="0ad19fb7829d03a8a4136df7f8c7600016368715c9509a97e342d1da83350e0d" exitCode=143 Sep 30 06:39:08 crc kubenswrapper[4691]: I0930 06:39:08.131987 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72","Type":"ContainerDied","Data":"0ad19fb7829d03a8a4136df7f8c7600016368715c9509a97e342d1da83350e0d"} Sep 30 06:39:08 crc kubenswrapper[4691]: I0930 06:39:08.135108 4691 generic.go:334] "Generic (PLEG): container finished" podID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerID="7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852" exitCode=0 Sep 30 06:39:08 crc kubenswrapper[4691]: I0930 06:39:08.135163 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4528ae48-9ab1-424c-8cb0-d5cb6996562b","Type":"ContainerDied","Data":"7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852"} Sep 30 06:39:08 crc kubenswrapper[4691]: I0930 
06:39:08.531845 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:39:08 crc kubenswrapper[4691]: I0930 06:39:08.577439 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 06:39:08 crc kubenswrapper[4691]: I0930 06:39:08.577485 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.146477 4691 generic.go:334] "Generic (PLEG): container finished" podID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerID="f8edbd476f29040c250467d5a4c086a09ff5bb7829dd180fc19a411589723007" exitCode=0 Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.146968 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72","Type":"ContainerDied","Data":"f8edbd476f29040c250467d5a4c086a09ff5bb7829dd180fc19a411589723007"} Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.147008 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72","Type":"ContainerDied","Data":"e0289f034f94f0daf228e0ba060236603fd112385db9c97e21668703f08f8a8c"} Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.147022 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0289f034f94f0daf228e0ba060236603fd112385db9c97e21668703f08f8a8c" Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.215299 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.300599 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-config-data\") pod \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.300695 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-combined-ca-bundle\") pod \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.300820 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-logs\") pod \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.300929 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dczr\" (UniqueName: \"kubernetes.io/projected/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-kube-api-access-2dczr\") pod \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\" (UID: \"f18aaea4-d2f7-47e6-bb32-00b3ba99ea72\") " Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.302356 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-logs" (OuterVolumeSpecName: "logs") pod "f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" (UID: "f18aaea4-d2f7-47e6-bb32-00b3ba99ea72"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.314057 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-kube-api-access-2dczr" (OuterVolumeSpecName: "kube-api-access-2dczr") pod "f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" (UID: "f18aaea4-d2f7-47e6-bb32-00b3ba99ea72"). InnerVolumeSpecName "kube-api-access-2dczr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.336037 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-config-data" (OuterVolumeSpecName: "config-data") pod "f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" (UID: "f18aaea4-d2f7-47e6-bb32-00b3ba99ea72"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.351074 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" (UID: "f18aaea4-d2f7-47e6-bb32-00b3ba99ea72"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.403713 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.403740 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.403769 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:09 crc kubenswrapper[4691]: I0930 06:39:09.403778 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2dczr\" (UniqueName: \"kubernetes.io/projected/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72-kube-api-access-2dczr\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.168568 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.215089 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.271095 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.285561 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:10 crc kubenswrapper[4691]: E0930 06:39:10.286059 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerName="nova-api-log" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.286083 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerName="nova-api-log" Sep 30 06:39:10 crc kubenswrapper[4691]: E0930 06:39:10.286117 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerName="nova-api-api" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.286126 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerName="nova-api-api" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.286349 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerName="nova-api-log" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.286374 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" containerName="nova-api-api" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.287781 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.289103 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.289314 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.302371 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.305745 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.323158 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.323243 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-config-data\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.323265 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5pbwx\" (UniqueName: \"kubernetes.io/projected/d72ab7f3-2303-4207-b7e5-c7935165f9ae-kube-api-access-5pbwx\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.323313 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d72ab7f3-2303-4207-b7e5-c7935165f9ae-logs\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.323381 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.323435 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.425370 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.425501 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-config-data\") pod \"nova-api-0\" (UID: 
\"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.425550 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5pbwx\" (UniqueName: \"kubernetes.io/projected/d72ab7f3-2303-4207-b7e5-c7935165f9ae-kube-api-access-5pbwx\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.425649 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d72ab7f3-2303-4207-b7e5-c7935165f9ae-logs\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.425692 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.425775 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.426477 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d72ab7f3-2303-4207-b7e5-c7935165f9ae-logs\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.431877 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-internal-tls-certs\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.432177 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.432459 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-config-data\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.437394 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-public-tls-certs\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.443381 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5pbwx\" (UniqueName: \"kubernetes.io/projected/d72ab7f3-2303-4207-b7e5-c7935165f9ae-kube-api-access-5pbwx\") pod \"nova-api-0\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " pod="openstack/nova-api-0" Sep 
30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.473611 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.0.212:3000/\": dial tcp 10.217.0.212:3000: connect: connection refused" Sep 30 06:39:10 crc kubenswrapper[4691]: I0930 06:39:10.605709 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:39:11 crc kubenswrapper[4691]: I0930 06:39:11.115134 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:11 crc kubenswrapper[4691]: I0930 06:39:11.177153 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d72ab7f3-2303-4207-b7e5-c7935165f9ae","Type":"ContainerStarted","Data":"e8c35109546b1f47bdf221f40ed5c34b0b7a1c01cbe87d7c133d0c42c7616a52"} Sep 30 06:39:11 crc kubenswrapper[4691]: I0930 06:39:11.242841 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f18aaea4-d2f7-47e6-bb32-00b3ba99ea72" path="/var/lib/kubelet/pods/f18aaea4-d2f7-47e6-bb32-00b3ba99ea72/volumes" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.196593 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d72ab7f3-2303-4207-b7e5-c7935165f9ae","Type":"ContainerStarted","Data":"29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb"} Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.197038 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d72ab7f3-2303-4207-b7e5-c7935165f9ae","Type":"ContainerStarted","Data":"3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b"} Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.242079 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.242053068 podStartE2EDuration="2.242053068s" podCreationTimestamp="2025-09-30 06:39:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:39:12.231759101 +0000 UTC m=+1195.706780181" watchObservedRunningTime="2025-09-30 06:39:12.242053068 +0000 UTC m=+1195.717074118" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.779129 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.875776 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-combined-ca-bundle\") pod \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.875824 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkqgh\" (UniqueName: \"kubernetes.io/projected/4528ae48-9ab1-424c-8cb0-d5cb6996562b-kube-api-access-mkqgh\") pod \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.875946 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-sg-core-conf-yaml\") pod \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.875998 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-scripts\") pod \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.876042 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-config-data\") pod \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.876111 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-log-httpd\") pod \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.876149 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-run-httpd\") pod \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\" (UID: \"4528ae48-9ab1-424c-8cb0-d5cb6996562b\") " Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.876780 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4528ae48-9ab1-424c-8cb0-d5cb6996562b" (UID: "4528ae48-9ab1-424c-8cb0-d5cb6996562b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.877098 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4528ae48-9ab1-424c-8cb0-d5cb6996562b" (UID: "4528ae48-9ab1-424c-8cb0-d5cb6996562b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.881928 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4528ae48-9ab1-424c-8cb0-d5cb6996562b-kube-api-access-mkqgh" (OuterVolumeSpecName: "kube-api-access-mkqgh") pod "4528ae48-9ab1-424c-8cb0-d5cb6996562b" (UID: "4528ae48-9ab1-424c-8cb0-d5cb6996562b"). InnerVolumeSpecName "kube-api-access-mkqgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.882469 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-scripts" (OuterVolumeSpecName: "scripts") pod "4528ae48-9ab1-424c-8cb0-d5cb6996562b" (UID: "4528ae48-9ab1-424c-8cb0-d5cb6996562b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.953928 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4528ae48-9ab1-424c-8cb0-d5cb6996562b" (UID: "4528ae48-9ab1-424c-8cb0-d5cb6996562b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.977599 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.977628 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.977637 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.977645 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4528ae48-9ab1-424c-8cb0-d5cb6996562b-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:12 crc kubenswrapper[4691]: I0930 06:39:12.977653 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkqgh\" (UniqueName: \"kubernetes.io/projected/4528ae48-9ab1-424c-8cb0-d5cb6996562b-kube-api-access-mkqgh\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.025392 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-config-data" (OuterVolumeSpecName: "config-data") pod "4528ae48-9ab1-424c-8cb0-d5cb6996562b" (UID: "4528ae48-9ab1-424c-8cb0-d5cb6996562b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.031303 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4528ae48-9ab1-424c-8cb0-d5cb6996562b" (UID: "4528ae48-9ab1-424c-8cb0-d5cb6996562b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.079480 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.079517 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4528ae48-9ab1-424c-8cb0-d5cb6996562b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.207879 4691 generic.go:334] "Generic (PLEG): container finished" podID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerID="374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434" exitCode=0 Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.207949 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.207982 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4528ae48-9ab1-424c-8cb0-d5cb6996562b","Type":"ContainerDied","Data":"374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434"} Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.208032 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4528ae48-9ab1-424c-8cb0-d5cb6996562b","Type":"ContainerDied","Data":"1d99baf41abb261804ca7ccb2045cfdcf3beb389c04a1fb87cd6173f9d309bbf"} Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.208050 4691 scope.go:117] "RemoveContainer" containerID="a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.239483 4691 scope.go:117] "RemoveContainer" containerID="1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.256250 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.273567 4691 scope.go:117] "RemoveContainer" containerID="374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.284219 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.322377 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:13 crc kubenswrapper[4691]: E0930 06:39:13.323042 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="ceilometer-notification-agent" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.323067 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="ceilometer-notification-agent" Sep 30 06:39:13 crc kubenswrapper[4691]: E0930 06:39:13.323081 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="proxy-httpd" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.323090 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="proxy-httpd" Sep 30 06:39:13 crc kubenswrapper[4691]: E0930 06:39:13.323112 4691 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="ceilometer-central-agent" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.323121 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="ceilometer-central-agent" Sep 30 06:39:13 crc kubenswrapper[4691]: E0930 06:39:13.323134 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="sg-core" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.323141 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="sg-core" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.323398 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="ceilometer-notification-agent" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.323422 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="ceilometer-central-agent" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.323439 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="proxy-httpd" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.323453 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" containerName="sg-core" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.325800 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.329596 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.330660 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.333532 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.333825 4691 scope.go:117] "RemoveContainer" containerID="7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.380793 4691 scope.go:117] "RemoveContainer" containerID="a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b" Sep 30 06:39:13 crc kubenswrapper[4691]: E0930 06:39:13.381155 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b\": container with ID starting with a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b not found: ID does not exist" containerID="a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.381189 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b"} err="failed to get container status \"a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b\": rpc error: code = NotFound desc = could not find container \"a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b\": container with ID starting with a55efa7ab36921ab411af4d3ca4223770f34f484c7b1d6e55bedbe647594d16b not found: ID does not exist" 
Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.381210 4691 scope.go:117] "RemoveContainer" containerID="1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5" Sep 30 06:39:13 crc kubenswrapper[4691]: E0930 06:39:13.381684 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5\": container with ID starting with 1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5 not found: ID does not exist" containerID="1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.381729 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5"} err="failed to get container status \"1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5\": rpc error: code = NotFound desc = could not find container \"1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5\": container with ID starting with 1eae33ebf278f94d485a71973ff7e8f5034fb88c81f4e76b35433760bd2cccc5 not found: ID does not exist" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.381746 4691 scope.go:117] "RemoveContainer" containerID="374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434" Sep 30 06:39:13 crc kubenswrapper[4691]: E0930 06:39:13.382154 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434\": container with ID starting with 374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434 not found: ID does not exist" containerID="374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.382216 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434"} err="failed to get container status \"374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434\": rpc error: code = NotFound desc = could not find container \"374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434\": container with ID starting with 374c3f0e049e24622ea5631f0df57cb505aefcf66d3d95d05155ea9a94ea1434 not found: ID does not exist" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.382251 4691 scope.go:117] "RemoveContainer" containerID="7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852" Sep 30 06:39:13 crc kubenswrapper[4691]: E0930 06:39:13.382671 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852\": container with ID starting with 7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852 not found: ID does not exist" containerID="7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.382717 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852"} err="failed to get container status \"7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852\": rpc error: code = NotFound desc = could not find container 
\"7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852\": container with ID starting with 7eaf8010a19696f26cccb1404073cce1d7975457638251360bbba51c5ae32852 not found: ID does not exist" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.405058 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.405131 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.405212 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbk7c\" (UniqueName: \"kubernetes.io/projected/16b1887e-8994-401c-a175-6749c32fc173-kube-api-access-nbk7c\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.405258 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-config-data\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.405287 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-scripts\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.405333 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-run-httpd\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.405366 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-log-httpd\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.507162 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-run-httpd\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.507555 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-log-httpd\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 
06:39:13.507622 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-run-httpd\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.507664 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.507708 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.507796 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nbk7c\" (UniqueName: \"kubernetes.io/projected/16b1887e-8994-401c-a175-6749c32fc173-kube-api-access-nbk7c\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.507845 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-config-data\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.507875 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-scripts\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.508015 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-log-httpd\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.511232 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.512028 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-scripts\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.512348 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-config-data\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.513731 4691 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.522984 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbk7c\" (UniqueName: \"kubernetes.io/projected/16b1887e-8994-401c-a175-6749c32fc173-kube-api-access-nbk7c\") pod \"ceilometer-0\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " pod="openstack/ceilometer-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.531923 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.548348 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.576895 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.577022 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 06:39:13 crc kubenswrapper[4691]: I0930 06:39:13.682318 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:39:14 crc kubenswrapper[4691]: W0930 06:39:14.146206 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod16b1887e_8994_401c_a175_6749c32fc173.slice/crio-db41a7847210f858dbed08f73548bc6eecc5500c4dc752f5489659e6e580df05 WatchSource:0}: Error finding container db41a7847210f858dbed08f73548bc6eecc5500c4dc752f5489659e6e580df05: Status 404 returned error can't find the container with id db41a7847210f858dbed08f73548bc6eecc5500c4dc752f5489659e6e580df05 Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.173367 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.228356 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16b1887e-8994-401c-a175-6749c32fc173","Type":"ContainerStarted","Data":"db41a7847210f858dbed08f73548bc6eecc5500c4dc752f5489659e6e580df05"} Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.265011 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.450941 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-xqb84"] Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.452479 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.454568 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.454916 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.472334 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xqb84"] Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.525430 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.525615 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwvg5\" (UniqueName: \"kubernetes.io/projected/08f0f027-7a83-431f-849a-639df1298562-kube-api-access-xwvg5\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.525697 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-config-data\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.525776 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-scripts\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.585159 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.592069 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.216:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.626994 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwvg5\" (UniqueName: \"kubernetes.io/projected/08f0f027-7a83-431f-849a-639df1298562-kube-api-access-xwvg5\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.627063 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-config-data\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.627109 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-scripts\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.627180 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.630797 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.634454 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-config-data\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.641585 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-scripts\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.645100 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwvg5\" (UniqueName: \"kubernetes.io/projected/08f0f027-7a83-431f-849a-639df1298562-kube-api-access-xwvg5\") pod \"nova-cell1-cell-mapping-xqb84\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.727908 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c45569577-gvk65" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.777786 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.802974 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57676fb98c-lb5zs"] Sep 30 06:39:14 crc kubenswrapper[4691]: I0930 06:39:14.803194 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" podUID="1944fc3a-694c-4642-a242-ace9c04f708e" containerName="dnsmasq-dns" containerID="cri-o://0bd4b43e5a043453bd748d5aa177344a3b55fc5430f9653230d74ed1396dc421" gracePeriod=10 Sep 30 06:39:15 crc kubenswrapper[4691]: I0930 06:39:15.238235 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4528ae48-9ab1-424c-8cb0-d5cb6996562b" path="/var/lib/kubelet/pods/4528ae48-9ab1-424c-8cb0-d5cb6996562b/volumes" Sep 30 06:39:15 crc kubenswrapper[4691]: I0930 06:39:15.244839 4691 generic.go:334] "Generic (PLEG): container finished" podID="1944fc3a-694c-4642-a242-ace9c04f708e" containerID="0bd4b43e5a043453bd748d5aa177344a3b55fc5430f9653230d74ed1396dc421" exitCode=0 Sep 30 06:39:15 crc kubenswrapper[4691]: I0930 06:39:15.244899 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" event={"ID":"1944fc3a-694c-4642-a242-ace9c04f708e","Type":"ContainerDied","Data":"0bd4b43e5a043453bd748d5aa177344a3b55fc5430f9653230d74ed1396dc421"} Sep 30 06:39:15 crc kubenswrapper[4691]: I0930 06:39:15.247321 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16b1887e-8994-401c-a175-6749c32fc173","Type":"ContainerStarted","Data":"408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab"} Sep 30 06:39:15 crc kubenswrapper[4691]: I0930 06:39:15.247340 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16b1887e-8994-401c-a175-6749c32fc173","Type":"ContainerStarted","Data":"229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff"} Sep 30 06:39:16 crc kubenswrapper[4691]: W0930 06:39:16.070326 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08f0f027_7a83_431f_849a_639df1298562.slice/crio-a09f86cb613b5694133e9a23c203ed2276835e63af37f54bdfc2e9be293de4f9 WatchSource:0}: Error finding container a09f86cb613b5694133e9a23c203ed2276835e63af37f54bdfc2e9be293de4f9: Status 404 returned error can't find the container with id a09f86cb613b5694133e9a23c203ed2276835e63af37f54bdfc2e9be293de4f9 Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.074986 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-xqb84"] Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.158958 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.256241 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-config\") pod \"1944fc3a-694c-4642-a242-ace9c04f708e\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.256811 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-svc\") pod \"1944fc3a-694c-4642-a242-ace9c04f708e\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.257034 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-nb\") pod \"1944fc3a-694c-4642-a242-ace9c04f708e\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.257163 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlndh\" (UniqueName: \"kubernetes.io/projected/1944fc3a-694c-4642-a242-ace9c04f708e-kube-api-access-jlndh\") pod \"1944fc3a-694c-4642-a242-ace9c04f708e\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.258250 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-swift-storage-0\") pod \"1944fc3a-694c-4642-a242-ace9c04f708e\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.258377 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-sb\") pod \"1944fc3a-694c-4642-a242-ace9c04f708e\" (UID: \"1944fc3a-694c-4642-a242-ace9c04f708e\") " Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.262342 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1944fc3a-694c-4642-a242-ace9c04f708e-kube-api-access-jlndh" (OuterVolumeSpecName: "kube-api-access-jlndh") pod "1944fc3a-694c-4642-a242-ace9c04f708e" (UID: "1944fc3a-694c-4642-a242-ace9c04f708e"). InnerVolumeSpecName "kube-api-access-jlndh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.266436 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" event={"ID":"1944fc3a-694c-4642-a242-ace9c04f708e","Type":"ContainerDied","Data":"1e8067f5448b49ac398222e9f86e42b61bced0f8e7a31b7ad8f2ac4556de0d6d"} Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.266481 4691 scope.go:117] "RemoveContainer" containerID="0bd4b43e5a043453bd748d5aa177344a3b55fc5430f9653230d74ed1396dc421" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.266600 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57676fb98c-lb5zs" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.274715 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16b1887e-8994-401c-a175-6749c32fc173","Type":"ContainerStarted","Data":"1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f"} Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.280770 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xqb84" event={"ID":"08f0f027-7a83-431f-849a-639df1298562","Type":"ContainerStarted","Data":"a09f86cb613b5694133e9a23c203ed2276835e63af37f54bdfc2e9be293de4f9"} Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.290198 4691 scope.go:117] "RemoveContainer" containerID="c74887900b88c9e327db441aae0694a79b15b2f90098071683a764dbb7efb0a7" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.311423 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1944fc3a-694c-4642-a242-ace9c04f708e" (UID: "1944fc3a-694c-4642-a242-ace9c04f708e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.321298 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-xqb84" podStartSLOduration=2.321277646 podStartE2EDuration="2.321277646s" podCreationTimestamp="2025-09-30 06:39:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:39:16.297508392 +0000 UTC m=+1199.772529452" watchObservedRunningTime="2025-09-30 06:39:16.321277646 +0000 UTC m=+1199.796298686" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.327007 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-config" (OuterVolumeSpecName: "config") pod "1944fc3a-694c-4642-a242-ace9c04f708e" (UID: "1944fc3a-694c-4642-a242-ace9c04f708e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.327191 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1944fc3a-694c-4642-a242-ace9c04f708e" (UID: "1944fc3a-694c-4642-a242-ace9c04f708e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.332368 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1944fc3a-694c-4642-a242-ace9c04f708e" (UID: "1944fc3a-694c-4642-a242-ace9c04f708e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.341100 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1944fc3a-694c-4642-a242-ace9c04f708e" (UID: "1944fc3a-694c-4642-a242-ace9c04f708e"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.360427 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.360453 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.360464 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.360474 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.360483 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1944fc3a-694c-4642-a242-ace9c04f708e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.360492 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlndh\" (UniqueName: \"kubernetes.io/projected/1944fc3a-694c-4642-a242-ace9c04f708e-kube-api-access-jlndh\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.604613 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57676fb98c-lb5zs"] Sep 30 06:39:16 crc kubenswrapper[4691]: I0930 06:39:16.611945 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57676fb98c-lb5zs"] Sep 30 06:39:17 crc kubenswrapper[4691]: I0930 06:39:17.242413 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1944fc3a-694c-4642-a242-ace9c04f708e" path="/var/lib/kubelet/pods/1944fc3a-694c-4642-a242-ace9c04f708e/volumes" Sep 30 06:39:17 crc kubenswrapper[4691]: I0930 06:39:17.293541 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16b1887e-8994-401c-a175-6749c32fc173","Type":"ContainerStarted","Data":"a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149"} Sep 30 06:39:17 crc kubenswrapper[4691]: I0930 06:39:17.293679 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 06:39:17 crc kubenswrapper[4691]: I0930 06:39:17.295458 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xqb84" event={"ID":"08f0f027-7a83-431f-849a-639df1298562","Type":"ContainerStarted","Data":"7cbeb0e8a236f64eee17cfcfd4c7a9bdbe2a324c7d204ef5b6b43b8dea6677cb"} Sep 30 06:39:17 crc kubenswrapper[4691]: I0930 06:39:17.320961 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.478205239 podStartE2EDuration="4.320940853s" podCreationTimestamp="2025-09-30 06:39:13 +0000 UTC" firstStartedPulling="2025-09-30 06:39:14.154056082 +0000 UTC m=+1197.629077122" lastFinishedPulling="2025-09-30 06:39:16.996791656 +0000 UTC m=+1200.471812736" observedRunningTime="2025-09-30 06:39:17.311253546 +0000 
UTC m=+1200.786274587" watchObservedRunningTime="2025-09-30 06:39:17.320940853 +0000 UTC m=+1200.795961893" Sep 30 06:39:20 crc kubenswrapper[4691]: I0930 06:39:20.606759 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 06:39:20 crc kubenswrapper[4691]: I0930 06:39:20.607441 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 06:39:21 crc kubenswrapper[4691]: I0930 06:39:21.625296 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 06:39:21 crc kubenswrapper[4691]: I0930 06:39:21.625312 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.218:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 06:39:22 crc kubenswrapper[4691]: I0930 06:39:22.371898 4691 generic.go:334] "Generic (PLEG): container finished" podID="08f0f027-7a83-431f-849a-639df1298562" containerID="7cbeb0e8a236f64eee17cfcfd4c7a9bdbe2a324c7d204ef5b6b43b8dea6677cb" exitCode=0 Sep 30 06:39:22 crc kubenswrapper[4691]: I0930 06:39:22.371992 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xqb84" event={"ID":"08f0f027-7a83-431f-849a-639df1298562","Type":"ContainerDied","Data":"7cbeb0e8a236f64eee17cfcfd4c7a9bdbe2a324c7d204ef5b6b43b8dea6677cb"} Sep 30 06:39:22 crc kubenswrapper[4691]: I0930 06:39:22.849975 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:39:22 crc kubenswrapper[4691]: I0930 06:39:22.850049 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.583402 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.585312 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.596910 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.816308 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.835210 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwvg5\" (UniqueName: \"kubernetes.io/projected/08f0f027-7a83-431f-849a-639df1298562-kube-api-access-xwvg5\") pod \"08f0f027-7a83-431f-849a-639df1298562\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.835279 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-config-data\") pod \"08f0f027-7a83-431f-849a-639df1298562\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.835385 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-scripts\") pod \"08f0f027-7a83-431f-849a-639df1298562\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.835556 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-combined-ca-bundle\") pod \"08f0f027-7a83-431f-849a-639df1298562\" (UID: \"08f0f027-7a83-431f-849a-639df1298562\") " Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.841183 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-scripts" (OuterVolumeSpecName: "scripts") pod "08f0f027-7a83-431f-849a-639df1298562" (UID: "08f0f027-7a83-431f-849a-639df1298562"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.841579 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08f0f027-7a83-431f-849a-639df1298562-kube-api-access-xwvg5" (OuterVolumeSpecName: "kube-api-access-xwvg5") pod "08f0f027-7a83-431f-849a-639df1298562" (UID: "08f0f027-7a83-431f-849a-639df1298562"). InnerVolumeSpecName "kube-api-access-xwvg5". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.875118 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "08f0f027-7a83-431f-849a-639df1298562" (UID: "08f0f027-7a83-431f-849a-639df1298562"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.890604 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-config-data" (OuterVolumeSpecName: "config-data") pod "08f0f027-7a83-431f-849a-639df1298562" (UID: "08f0f027-7a83-431f-849a-639df1298562"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.938272 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.938324 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwvg5\" (UniqueName: \"kubernetes.io/projected/08f0f027-7a83-431f-849a-639df1298562-kube-api-access-xwvg5\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.938341 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:23 crc kubenswrapper[4691]: I0930 06:39:23.938355 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/08f0f027-7a83-431f-849a-639df1298562-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:24 crc kubenswrapper[4691]: I0930 06:39:24.440516 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-xqb84" event={"ID":"08f0f027-7a83-431f-849a-639df1298562","Type":"ContainerDied","Data":"a09f86cb613b5694133e9a23c203ed2276835e63af37f54bdfc2e9be293de4f9"} Sep 30 06:39:24 crc kubenswrapper[4691]: I0930 06:39:24.441161 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a09f86cb613b5694133e9a23c203ed2276835e63af37f54bdfc2e9be293de4f9" Sep 30 06:39:24 crc kubenswrapper[4691]: I0930 06:39:24.440543 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-xqb84" Sep 30 06:39:24 crc kubenswrapper[4691]: I0930 06:39:24.448404 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 06:39:24 crc kubenswrapper[4691]: I0930 06:39:24.712753 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:24 crc kubenswrapper[4691]: I0930 06:39:24.713065 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerName="nova-api-log" containerID="cri-o://3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b" gracePeriod=30 Sep 30 06:39:24 crc kubenswrapper[4691]: I0930 06:39:24.713247 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerName="nova-api-api" containerID="cri-o://29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb" gracePeriod=30 Sep 30 06:39:24 crc kubenswrapper[4691]: I0930 06:39:24.733769 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:39:24 crc kubenswrapper[4691]: I0930 06:39:24.734062 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="32d7b989-18a3-4e18-8eb5-e3f7856ed003" containerName="nova-scheduler-scheduler" containerID="cri-o://bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c" gracePeriod=30 Sep 30 06:39:24 crc kubenswrapper[4691]: I0930 06:39:24.777397 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 06:39:25 crc kubenswrapper[4691]: I0930 
06:39:25.450775 4691 generic.go:334] "Generic (PLEG): container finished" podID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerID="3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b" exitCode=143 Sep 30 06:39:25 crc kubenswrapper[4691]: I0930 06:39:25.450835 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d72ab7f3-2303-4207-b7e5-c7935165f9ae","Type":"ContainerDied","Data":"3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b"} Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.021398 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.184319 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-internal-tls-certs\") pod \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.184392 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d72ab7f3-2303-4207-b7e5-c7935165f9ae-logs\") pod \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.184564 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-public-tls-certs\") pod \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.184642 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5pbwx\" (UniqueName: \"kubernetes.io/projected/d72ab7f3-2303-4207-b7e5-c7935165f9ae-kube-api-access-5pbwx\") pod \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.184711 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-combined-ca-bundle\") pod \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.184749 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-config-data\") pod \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\" (UID: \"d72ab7f3-2303-4207-b7e5-c7935165f9ae\") " Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.185458 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d72ab7f3-2303-4207-b7e5-c7935165f9ae-logs" (OuterVolumeSpecName: "logs") pod "d72ab7f3-2303-4207-b7e5-c7935165f9ae" (UID: "d72ab7f3-2303-4207-b7e5-c7935165f9ae"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.200192 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d72ab7f3-2303-4207-b7e5-c7935165f9ae-kube-api-access-5pbwx" (OuterVolumeSpecName: "kube-api-access-5pbwx") pod "d72ab7f3-2303-4207-b7e5-c7935165f9ae" (UID: "d72ab7f3-2303-4207-b7e5-c7935165f9ae"). InnerVolumeSpecName "kube-api-access-5pbwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.212580 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d72ab7f3-2303-4207-b7e5-c7935165f9ae" (UID: "d72ab7f3-2303-4207-b7e5-c7935165f9ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.240239 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-config-data" (OuterVolumeSpecName: "config-data") pod "d72ab7f3-2303-4207-b7e5-c7935165f9ae" (UID: "d72ab7f3-2303-4207-b7e5-c7935165f9ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.259370 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "d72ab7f3-2303-4207-b7e5-c7935165f9ae" (UID: "d72ab7f3-2303-4207-b7e5-c7935165f9ae"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.269748 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "d72ab7f3-2303-4207-b7e5-c7935165f9ae" (UID: "d72ab7f3-2303-4207-b7e5-c7935165f9ae"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.287335 4691 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.287362 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d72ab7f3-2303-4207-b7e5-c7935165f9ae-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.287372 4691 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-public-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.287382 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5pbwx\" (UniqueName: \"kubernetes.io/projected/d72ab7f3-2303-4207-b7e5-c7935165f9ae-kube-api-access-5pbwx\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.287393 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.287400 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d72ab7f3-2303-4207-b7e5-c7935165f9ae-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.475021 4691 generic.go:334] "Generic (PLEG): container finished" podID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerID="29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb" exitCode=0 Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.475159 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.475210 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d72ab7f3-2303-4207-b7e5-c7935165f9ae","Type":"ContainerDied","Data":"29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb"} Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.476133 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"d72ab7f3-2303-4207-b7e5-c7935165f9ae","Type":"ContainerDied","Data":"e8c35109546b1f47bdf221f40ed5c34b0b7a1c01cbe87d7c133d0c42c7616a52"} Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.476241 4691 scope.go:117] "RemoveContainer" containerID="29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.477040 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerName="nova-metadata-log" containerID="cri-o://3f35b801a7adea0b68ae6fca821129b794ddb77d7d6a746ac413ab3a83955888" gracePeriod=30 Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.477330 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerName="nova-metadata-metadata" containerID="cri-o://cd752a11f112b7e1fba4875c4d6b5ab82518ce13fcc8be6237889e56a52a800d" gracePeriod=30 Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.504080 4691 scope.go:117] "RemoveContainer" containerID="3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.530213 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.535021 4691 scope.go:117] "RemoveContainer" containerID="29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb" Sep 30 06:39:26 crc kubenswrapper[4691]: E0930 06:39:26.538015 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb\": container with ID starting with 29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb not found: ID does not exist" containerID="29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.538071 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb"} err="failed to get container status \"29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb\": rpc error: code = NotFound desc = could not find container \"29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb\": container with ID starting with 29522a106bf94e484acca61feedfcedb7809e1553ccbb2d963446ef6ca2a29fb not found: ID does not exist" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.538100 4691 scope.go:117] "RemoveContainer" containerID="3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b" Sep 30 06:39:26 crc kubenswrapper[4691]: E0930 06:39:26.543342 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b\": container with ID starting 
with 3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b not found: ID does not exist" containerID="3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.543395 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b"} err="failed to get container status \"3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b\": rpc error: code = NotFound desc = could not find container \"3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b\": container with ID starting with 3e72d4958cccdada72e2480f668d648b1001d1e49df9145d6b5c04754ca7696b not found: ID does not exist" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.552640 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.569263 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:26 crc kubenswrapper[4691]: E0930 06:39:26.569680 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1944fc3a-694c-4642-a242-ace9c04f708e" containerName="dnsmasq-dns" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.569700 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1944fc3a-694c-4642-a242-ace9c04f708e" containerName="dnsmasq-dns" Sep 30 06:39:26 crc kubenswrapper[4691]: E0930 06:39:26.569713 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08f0f027-7a83-431f-849a-639df1298562" containerName="nova-manage" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.569721 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="08f0f027-7a83-431f-849a-639df1298562" containerName="nova-manage" Sep 30 06:39:26 crc kubenswrapper[4691]: E0930 06:39:26.569754 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerName="nova-api-log" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.569760 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerName="nova-api-log" Sep 30 06:39:26 crc kubenswrapper[4691]: E0930 06:39:26.569770 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerName="nova-api-api" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.569776 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerName="nova-api-api" Sep 30 06:39:26 crc kubenswrapper[4691]: E0930 06:39:26.569788 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1944fc3a-694c-4642-a242-ace9c04f708e" containerName="init" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.569794 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1944fc3a-694c-4642-a242-ace9c04f708e" containerName="init" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.569986 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1944fc3a-694c-4642-a242-ace9c04f708e" containerName="dnsmasq-dns" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.570006 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerName="nova-api-api" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.570016 4691 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" containerName="nova-api-log" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.570027 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="08f0f027-7a83-431f-849a-639df1298562" containerName="nova-manage" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.571040 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.574803 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.576611 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.580334 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.581827 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.701334 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b5ef76-7c77-4698-9f7f-219791e59bd2-logs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.701462 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-config-data\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.701482 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-public-tls-certs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.701507 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.701552 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9mrs\" (UniqueName: \"kubernetes.io/projected/f8b5ef76-7c77-4698-9f7f-219791e59bd2-kube-api-access-q9mrs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.701571 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.803366 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.803917 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9mrs\" (UniqueName: \"kubernetes.io/projected/f8b5ef76-7c77-4698-9f7f-219791e59bd2-kube-api-access-q9mrs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.803967 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.804040 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b5ef76-7c77-4698-9f7f-219791e59bd2-logs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.804210 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-config-data\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.804232 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-public-tls-certs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.804957 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f8b5ef76-7c77-4698-9f7f-219791e59bd2-logs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.809349 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.809392 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.810951 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-public-tls-certs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.812494 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f8b5ef76-7c77-4698-9f7f-219791e59bd2-config-data\") pod 
\"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.835682 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9mrs\" (UniqueName: \"kubernetes.io/projected/f8b5ef76-7c77-4698-9f7f-219791e59bd2-kube-api-access-q9mrs\") pod \"nova-api-0\" (UID: \"f8b5ef76-7c77-4698-9f7f-219791e59bd2\") " pod="openstack/nova-api-0" Sep 30 06:39:26 crc kubenswrapper[4691]: I0930 06:39:26.944680 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.240941 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d72ab7f3-2303-4207-b7e5-c7935165f9ae" path="/var/lib/kubelet/pods/d72ab7f3-2303-4207-b7e5-c7935165f9ae/volumes" Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.430986 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.489314 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f8b5ef76-7c77-4698-9f7f-219791e59bd2","Type":"ContainerStarted","Data":"1af2b9be2bbbe3f180d89fcbb63406fe5971134c251d32e5230655b85cbf4772"} Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.491509 4691 generic.go:334] "Generic (PLEG): container finished" podID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerID="cd752a11f112b7e1fba4875c4d6b5ab82518ce13fcc8be6237889e56a52a800d" exitCode=0 Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.491540 4691 generic.go:334] "Generic (PLEG): container finished" podID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerID="3f35b801a7adea0b68ae6fca821129b794ddb77d7d6a746ac413ab3a83955888" exitCode=143 Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.491587 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb","Type":"ContainerDied","Data":"cd752a11f112b7e1fba4875c4d6b5ab82518ce13fcc8be6237889e56a52a800d"} Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.491613 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb","Type":"ContainerDied","Data":"3f35b801a7adea0b68ae6fca821129b794ddb77d7d6a746ac413ab3a83955888"} Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.753107 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.928609 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-combined-ca-bundle\") pod \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.928741 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-nova-metadata-tls-certs\") pod \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.928778 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-config-data\") pod \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.928824 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-logs\") pod \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.928933 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq5nr\" (UniqueName: \"kubernetes.io/projected/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-kube-api-access-nq5nr\") pod \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\" (UID: \"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb\") " Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.929533 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-logs" (OuterVolumeSpecName: "logs") pod "afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" (UID: "afe1b16e-3c68-4b97-ad22-7b38c84ef6bb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.934326 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-kube-api-access-nq5nr" (OuterVolumeSpecName: "kube-api-access-nq5nr") pod "afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" (UID: "afe1b16e-3c68-4b97-ad22-7b38c84ef6bb"). InnerVolumeSpecName "kube-api-access-nq5nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.972436 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" (UID: "afe1b16e-3c68-4b97-ad22-7b38c84ef6bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:27 crc kubenswrapper[4691]: I0930 06:39:27.977665 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-config-data" (OuterVolumeSpecName: "config-data") pod "afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" (UID: "afe1b16e-3c68-4b97-ad22-7b38c84ef6bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.000195 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" (UID: "afe1b16e-3c68-4b97-ad22-7b38c84ef6bb"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.031534 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.031705 4691 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-logs\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.031788 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq5nr\" (UniqueName: \"kubernetes.io/projected/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-kube-api-access-nq5nr\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.031841 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.031908 4691 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.278570 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.436707 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-config-data\") pod \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.436843 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngttl\" (UniqueName: \"kubernetes.io/projected/32d7b989-18a3-4e18-8eb5-e3f7856ed003-kube-api-access-ngttl\") pod \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.439140 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-combined-ca-bundle\") pod \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\" (UID: \"32d7b989-18a3-4e18-8eb5-e3f7856ed003\") " Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.456161 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32d7b989-18a3-4e18-8eb5-e3f7856ed003-kube-api-access-ngttl" (OuterVolumeSpecName: "kube-api-access-ngttl") pod "32d7b989-18a3-4e18-8eb5-e3f7856ed003" (UID: "32d7b989-18a3-4e18-8eb5-e3f7856ed003"). InnerVolumeSpecName "kube-api-access-ngttl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.476604 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32d7b989-18a3-4e18-8eb5-e3f7856ed003" (UID: "32d7b989-18a3-4e18-8eb5-e3f7856ed003"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.478395 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-config-data" (OuterVolumeSpecName: "config-data") pod "32d7b989-18a3-4e18-8eb5-e3f7856ed003" (UID: "32d7b989-18a3-4e18-8eb5-e3f7856ed003"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.503911 4691 generic.go:334] "Generic (PLEG): container finished" podID="32d7b989-18a3-4e18-8eb5-e3f7856ed003" containerID="bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c" exitCode=0 Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.503971 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.503990 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32d7b989-18a3-4e18-8eb5-e3f7856ed003","Type":"ContainerDied","Data":"bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c"} Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.504261 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"32d7b989-18a3-4e18-8eb5-e3f7856ed003","Type":"ContainerDied","Data":"fff6814417c248ce18160414bf0d6d2450b1d7c41a9a51a9a949219b9793da34"} Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.504304 4691 scope.go:117] "RemoveContainer" containerID="bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.506811 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f8b5ef76-7c77-4698-9f7f-219791e59bd2","Type":"ContainerStarted","Data":"76210240125e4eeb69d92986748e85a38b2e4452217910e924d2288e114b2363"} Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.506831 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f8b5ef76-7c77-4698-9f7f-219791e59bd2","Type":"ContainerStarted","Data":"619b0df7dfb6e8551adda699423853acb7e50275a9a8dea9c159a99f80daa164"} Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.510659 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afe1b16e-3c68-4b97-ad22-7b38c84ef6bb","Type":"ContainerDied","Data":"e8b8322eb40bfe7a9b99406d248176be8ee5bb7b9482b25f61cbc1fbff4d09de"} Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.510713 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.544996 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.545045 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32d7b989-18a3-4e18-8eb5-e3f7856ed003-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.545056 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngttl\" (UniqueName: \"kubernetes.io/projected/32d7b989-18a3-4e18-8eb5-e3f7856ed003-kube-api-access-ngttl\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.558588 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.558570048 podStartE2EDuration="2.558570048s" podCreationTimestamp="2025-09-30 06:39:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:39:28.540208486 +0000 UTC m=+1212.015229546" watchObservedRunningTime="2025-09-30 06:39:28.558570048 +0000 UTC m=+1212.033591088" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.562318 4691 scope.go:117] "RemoveContainer" containerID="bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c" Sep 30 06:39:28 crc kubenswrapper[4691]: E0930 06:39:28.567905 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c\": container with ID starting with bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c not found: ID does not exist" containerID="bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.567964 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c"} err="failed to get container status \"bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c\": rpc error: code = NotFound desc = could not find container \"bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c\": container with ID starting with bbdb88b55728fea6717d8ca7d961ff0ae840abbe8018b24e5b4ab2bbaa6bb23c not found: ID does not exist" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.567994 4691 scope.go:117] "RemoveContainer" containerID="cd752a11f112b7e1fba4875c4d6b5ab82518ce13fcc8be6237889e56a52a800d" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.591733 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.604950 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.616974 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.621000 4691 scope.go:117] "RemoveContainer" containerID="3f35b801a7adea0b68ae6fca821129b794ddb77d7d6a746ac413ab3a83955888" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.627373 4691 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.690922 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Sep 30 06:39:28 crc kubenswrapper[4691]: E0930 06:39:28.691399 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32d7b989-18a3-4e18-8eb5-e3f7856ed003" containerName="nova-scheduler-scheduler" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.691418 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="32d7b989-18a3-4e18-8eb5-e3f7856ed003" containerName="nova-scheduler-scheduler" Sep 30 06:39:28 crc kubenswrapper[4691]: E0930 06:39:28.691435 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerName="nova-metadata-metadata" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.691441 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerName="nova-metadata-metadata" Sep 30 06:39:28 crc kubenswrapper[4691]: E0930 06:39:28.691460 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerName="nova-metadata-log" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.691469 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerName="nova-metadata-log" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.691631 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerName="nova-metadata-metadata" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.691658 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" containerName="nova-metadata-log" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.691674 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="32d7b989-18a3-4e18-8eb5-e3f7856ed003" containerName="nova-scheduler-scheduler" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.692768 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.695510 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.695797 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.717633 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.719228 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.721129 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.741018 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.751847 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6cce79-72b3-407a-8ac5-ca3782a878b5-logs\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.751911 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77f46\" (UniqueName: \"kubernetes.io/projected/4f6cce79-72b3-407a-8ac5-ca3782a878b5-kube-api-access-77f46\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.751949 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24eb00d6-56e5-477b-840b-ad3f6fd6e473-config-data\") pod \"nova-scheduler-0\" (UID: \"24eb00d6-56e5-477b-840b-ad3f6fd6e473\") " pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.752016 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6cce79-72b3-407a-8ac5-ca3782a878b5-config-data\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.752040 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6hh2\" (UniqueName: \"kubernetes.io/projected/24eb00d6-56e5-477b-840b-ad3f6fd6e473-kube-api-access-q6hh2\") pod \"nova-scheduler-0\" (UID: \"24eb00d6-56e5-477b-840b-ad3f6fd6e473\") " pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.752078 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6cce79-72b3-407a-8ac5-ca3782a878b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.752100 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6cce79-72b3-407a-8ac5-ca3782a878b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.752137 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24eb00d6-56e5-477b-840b-ad3f6fd6e473-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"24eb00d6-56e5-477b-840b-ad3f6fd6e473\") " pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.812865 4691 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.854744 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24eb00d6-56e5-477b-840b-ad3f6fd6e473-config-data\") pod \"nova-scheduler-0\" (UID: \"24eb00d6-56e5-477b-840b-ad3f6fd6e473\") " pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.854818 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6cce79-72b3-407a-8ac5-ca3782a878b5-config-data\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.854842 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6hh2\" (UniqueName: \"kubernetes.io/projected/24eb00d6-56e5-477b-840b-ad3f6fd6e473-kube-api-access-q6hh2\") pod \"nova-scheduler-0\" (UID: \"24eb00d6-56e5-477b-840b-ad3f6fd6e473\") " pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.854876 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6cce79-72b3-407a-8ac5-ca3782a878b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.854903 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6cce79-72b3-407a-8ac5-ca3782a878b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.854939 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24eb00d6-56e5-477b-840b-ad3f6fd6e473-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"24eb00d6-56e5-477b-840b-ad3f6fd6e473\") " pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.854986 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6cce79-72b3-407a-8ac5-ca3782a878b5-logs\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.855017 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77f46\" (UniqueName: \"kubernetes.io/projected/4f6cce79-72b3-407a-8ac5-ca3782a878b5-kube-api-access-77f46\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.856903 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4f6cce79-72b3-407a-8ac5-ca3782a878b5-logs\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.859272 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/24eb00d6-56e5-477b-840b-ad3f6fd6e473-config-data\") pod \"nova-scheduler-0\" (UID: \"24eb00d6-56e5-477b-840b-ad3f6fd6e473\") " pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.859626 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4f6cce79-72b3-407a-8ac5-ca3782a878b5-config-data\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.859742 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24eb00d6-56e5-477b-840b-ad3f6fd6e473-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"24eb00d6-56e5-477b-840b-ad3f6fd6e473\") " pod="openstack/nova-scheduler-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.860440 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/4f6cce79-72b3-407a-8ac5-ca3782a878b5-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.861743 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4f6cce79-72b3-407a-8ac5-ca3782a878b5-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.870466 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77f46\" (UniqueName: \"kubernetes.io/projected/4f6cce79-72b3-407a-8ac5-ca3782a878b5-kube-api-access-77f46\") pod \"nova-metadata-0\" (UID: \"4f6cce79-72b3-407a-8ac5-ca3782a878b5\") " pod="openstack/nova-metadata-0" Sep 30 06:39:28 crc kubenswrapper[4691]: I0930 06:39:28.874358 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6hh2\" (UniqueName: \"kubernetes.io/projected/24eb00d6-56e5-477b-840b-ad3f6fd6e473-kube-api-access-q6hh2\") pod \"nova-scheduler-0\" (UID: \"24eb00d6-56e5-477b-840b-ad3f6fd6e473\") " pod="openstack/nova-scheduler-0" Sep 30 06:39:29 crc kubenswrapper[4691]: I0930 06:39:29.071771 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Sep 30 06:39:29 crc kubenswrapper[4691]: I0930 06:39:29.106310 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Sep 30 06:39:29 crc kubenswrapper[4691]: I0930 06:39:29.255008 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32d7b989-18a3-4e18-8eb5-e3f7856ed003" path="/var/lib/kubelet/pods/32d7b989-18a3-4e18-8eb5-e3f7856ed003/volumes" Sep 30 06:39:29 crc kubenswrapper[4691]: I0930 06:39:29.256106 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afe1b16e-3c68-4b97-ad22-7b38c84ef6bb" path="/var/lib/kubelet/pods/afe1b16e-3c68-4b97-ad22-7b38c84ef6bb/volumes" Sep 30 06:39:29 crc kubenswrapper[4691]: I0930 06:39:29.533772 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Sep 30 06:39:29 crc kubenswrapper[4691]: I0930 06:39:29.635677 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Sep 30 06:39:30 crc kubenswrapper[4691]: I0930 06:39:30.547537 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f6cce79-72b3-407a-8ac5-ca3782a878b5","Type":"ContainerStarted","Data":"35e6ce26b8f03c2267a12dd4cb07fbdaed23b8cc38f1c342bdc8921c8436056a"} Sep 30 06:39:30 crc kubenswrapper[4691]: I0930 06:39:30.547611 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f6cce79-72b3-407a-8ac5-ca3782a878b5","Type":"ContainerStarted","Data":"2f12167c6b850ec8fe4491cba692190e6eb44aa0427ee239f95c1cf265f5dcca"} Sep 30 06:39:30 crc kubenswrapper[4691]: I0930 06:39:30.547640 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"4f6cce79-72b3-407a-8ac5-ca3782a878b5","Type":"ContainerStarted","Data":"cabda96fbf89ed72bc8939a3dbeca48c33f63178e7e90fe39398d813c0454898"} Sep 30 06:39:30 crc kubenswrapper[4691]: I0930 06:39:30.551744 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"24eb00d6-56e5-477b-840b-ad3f6fd6e473","Type":"ContainerStarted","Data":"98c7ac31cf40d23b656a8c5c166de2f3cb5443c8f7c2ca3be908bbc5395eb2bf"} Sep 30 06:39:30 crc kubenswrapper[4691]: I0930 06:39:30.551780 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"24eb00d6-56e5-477b-840b-ad3f6fd6e473","Type":"ContainerStarted","Data":"37b1ebcfbafff369ff21b561f149a0f4df48b221b86ef10c16b1c365adb4a91d"} Sep 30 06:39:30 crc kubenswrapper[4691]: I0930 06:39:30.584448 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.584428198 podStartE2EDuration="2.584428198s" podCreationTimestamp="2025-09-30 06:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:39:30.57129733 +0000 UTC m=+1214.046318460" watchObservedRunningTime="2025-09-30 06:39:30.584428198 +0000 UTC m=+1214.059449238" Sep 30 06:39:30 crc kubenswrapper[4691]: I0930 06:39:30.596575 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.596555822 podStartE2EDuration="2.596555822s" podCreationTimestamp="2025-09-30 06:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:39:30.592408961 +0000 UTC m=+1214.067430011" watchObservedRunningTime="2025-09-30 06:39:30.596555822 +0000 UTC m=+1214.071576862" Sep 30 06:39:34 crc kubenswrapper[4691]: I0930 06:39:34.071971 4691 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 06:39:34 crc kubenswrapper[4691]: I0930 06:39:34.072337 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Sep 30 06:39:34 crc kubenswrapper[4691]: I0930 06:39:34.106827 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Sep 30 06:39:36 crc kubenswrapper[4691]: I0930 06:39:36.946798 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 06:39:36 crc kubenswrapper[4691]: I0930 06:39:36.947153 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Sep 30 06:39:37 crc kubenswrapper[4691]: I0930 06:39:37.963264 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f8b5ef76-7c77-4698-9f7f-219791e59bd2" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 06:39:37 crc kubenswrapper[4691]: I0930 06:39:37.963280 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f8b5ef76-7c77-4698-9f7f-219791e59bd2" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.221:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Sep 30 06:39:39 crc kubenswrapper[4691]: I0930 06:39:39.072812 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 06:39:39 crc kubenswrapper[4691]: I0930 06:39:39.074402 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Sep 30 06:39:39 crc kubenswrapper[4691]: I0930 06:39:39.106592 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Sep 30 06:39:39 crc kubenswrapper[4691]: I0930 06:39:39.152740 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Sep 30 06:39:39 crc kubenswrapper[4691]: I0930 06:39:39.700579 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Sep 30 06:39:40 crc kubenswrapper[4691]: I0930 06:39:40.090294 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4f6cce79-72b3-407a-8ac5-ca3782a878b5" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 06:39:40 crc kubenswrapper[4691]: I0930 06:39:40.090294 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="4f6cce79-72b3-407a-8ac5-ca3782a878b5" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.222:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Sep 30 06:39:43 crc kubenswrapper[4691]: I0930 06:39:43.687852 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 06:39:46 crc kubenswrapper[4691]: I0930 06:39:46.963462 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 06:39:46 crc kubenswrapper[4691]: I0930 06:39:46.964288 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Sep 30 06:39:46 crc kubenswrapper[4691]: I0930 06:39:46.975078 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Sep 30 06:39:46 crc kubenswrapper[4691]: I0930 06:39:46.986837 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 06:39:47 crc kubenswrapper[4691]: I0930 06:39:47.777695 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Sep 30 06:39:47 crc kubenswrapper[4691]: I0930 06:39:47.790513 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.047001 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.047388 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="8ebf5adc-aea5-4d38-81e8-722c6f1db55c" containerName="kube-state-metrics" containerID="cri-o://0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0" gracePeriod=30 Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.569738 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.675306 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd52s\" (UniqueName: \"kubernetes.io/projected/8ebf5adc-aea5-4d38-81e8-722c6f1db55c-kube-api-access-vd52s\") pod \"8ebf5adc-aea5-4d38-81e8-722c6f1db55c\" (UID: \"8ebf5adc-aea5-4d38-81e8-722c6f1db55c\") " Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.683216 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ebf5adc-aea5-4d38-81e8-722c6f1db55c-kube-api-access-vd52s" (OuterVolumeSpecName: "kube-api-access-vd52s") pod "8ebf5adc-aea5-4d38-81e8-722c6f1db55c" (UID: "8ebf5adc-aea5-4d38-81e8-722c6f1db55c"). InnerVolumeSpecName "kube-api-access-vd52s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.777254 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd52s\" (UniqueName: \"kubernetes.io/projected/8ebf5adc-aea5-4d38-81e8-722c6f1db55c-kube-api-access-vd52s\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.797417 4691 generic.go:334] "Generic (PLEG): container finished" podID="8ebf5adc-aea5-4d38-81e8-722c6f1db55c" containerID="0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0" exitCode=2 Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.797487 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.797519 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8ebf5adc-aea5-4d38-81e8-722c6f1db55c","Type":"ContainerDied","Data":"0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0"} Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.797576 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"8ebf5adc-aea5-4d38-81e8-722c6f1db55c","Type":"ContainerDied","Data":"debd78bd3131a8ae316d65287dd37311473cfbdf99704f67c29877839f077bd3"} Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.797603 4691 scope.go:117] "RemoveContainer" containerID="0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.834807 4691 scope.go:117] "RemoveContainer" containerID="0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0" Sep 30 06:39:48 crc kubenswrapper[4691]: E0930 06:39:48.838222 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0\": container with ID starting with 0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0 not found: ID does not exist" containerID="0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.838274 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0"} err="failed to get container status \"0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0\": rpc error: code = NotFound desc = could not find container \"0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0\": container with ID starting with 0e42ca2b32b2f0526f514321e8691f5ee74dc750ee820e044519805d5ce11fb0 not found: ID does not exist" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.844942 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.857284 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.866690 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 06:39:48 crc kubenswrapper[4691]: E0930 06:39:48.867220 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ebf5adc-aea5-4d38-81e8-722c6f1db55c" containerName="kube-state-metrics" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.867238 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ebf5adc-aea5-4d38-81e8-722c6f1db55c" containerName="kube-state-metrics" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.867425 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ebf5adc-aea5-4d38-81e8-722c6f1db55c" containerName="kube-state-metrics" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.868260 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.871762 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.872066 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.882675 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 06:39:48 crc kubenswrapper[4691]: E0930 06:39:48.965131 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ebf5adc_aea5_4d38_81e8_722c6f1db55c.slice/crio-debd78bd3131a8ae316d65287dd37311473cfbdf99704f67c29877839f077bd3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8ebf5adc_aea5_4d38_81e8_722c6f1db55c.slice\": RecentStats: unable to find data in memory cache]" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.980191 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/36b81859-2533-442a-bf54-a2fe2a8a5baa-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.980572 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b81859-2533-442a-bf54-a2fe2a8a5baa-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.980595 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b81859-2533-442a-bf54-a2fe2a8a5baa-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:48 crc kubenswrapper[4691]: I0930 06:39:48.980618 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8n2dg\" (UniqueName: \"kubernetes.io/projected/36b81859-2533-442a-bf54-a2fe2a8a5baa-kube-api-access-8n2dg\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.078395 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.079532 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.081615 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/36b81859-2533-442a-bf54-a2fe2a8a5baa-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:49 crc 
kubenswrapper[4691]: I0930 06:39:49.081728 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b81859-2533-442a-bf54-a2fe2a8a5baa-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.081756 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b81859-2533-442a-bf54-a2fe2a8a5baa-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.081778 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8n2dg\" (UniqueName: \"kubernetes.io/projected/36b81859-2533-442a-bf54-a2fe2a8a5baa-kube-api-access-8n2dg\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.085688 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.085873 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/36b81859-2533-442a-bf54-a2fe2a8a5baa-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.086537 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/36b81859-2533-442a-bf54-a2fe2a8a5baa-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.088454 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/36b81859-2533-442a-bf54-a2fe2a8a5baa-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.096210 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.102724 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8n2dg\" (UniqueName: \"kubernetes.io/projected/36b81859-2533-442a-bf54-a2fe2a8a5baa-kube-api-access-8n2dg\") pod \"kube-state-metrics-0\" (UID: \"36b81859-2533-442a-bf54-a2fe2a8a5baa\") " pod="openstack/kube-state-metrics-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.188055 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.244967 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ebf5adc-aea5-4d38-81e8-722c6f1db55c" path="/var/lib/kubelet/pods/8ebf5adc-aea5-4d38-81e8-722c6f1db55c/volumes" Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.672133 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Sep 30 06:39:49 crc kubenswrapper[4691]: W0930 06:39:49.680741 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36b81859_2533_442a_bf54_a2fe2a8a5baa.slice/crio-4b19a3fa29c98d920acbd19623ac2e496fe735b665622e7916bc51125ccbc3f8 WatchSource:0}: Error finding container 4b19a3fa29c98d920acbd19623ac2e496fe735b665622e7916bc51125ccbc3f8: Status 404 returned error can't find the container with id 4b19a3fa29c98d920acbd19623ac2e496fe735b665622e7916bc51125ccbc3f8 Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.811035 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"36b81859-2533-442a-bf54-a2fe2a8a5baa","Type":"ContainerStarted","Data":"4b19a3fa29c98d920acbd19623ac2e496fe735b665622e7916bc51125ccbc3f8"} Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.990006 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.990356 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="ceilometer-central-agent" containerID="cri-o://229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff" gracePeriod=30 Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.990417 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="proxy-httpd" containerID="cri-o://a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149" gracePeriod=30 Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.990454 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="sg-core" containerID="cri-o://1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f" gracePeriod=30 Sep 30 06:39:49 crc kubenswrapper[4691]: I0930 06:39:49.990520 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="ceilometer-notification-agent" containerID="cri-o://408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab" gracePeriod=30 Sep 30 06:39:50 crc kubenswrapper[4691]: I0930 06:39:50.832720 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"36b81859-2533-442a-bf54-a2fe2a8a5baa","Type":"ContainerStarted","Data":"89963a4c1848371056d79711a480d9ce16b29a30044776e1697aa629c891c90c"} Sep 30 06:39:50 crc kubenswrapper[4691]: I0930 06:39:50.833868 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Sep 30 06:39:50 crc kubenswrapper[4691]: I0930 06:39:50.855208 4691 generic.go:334] "Generic (PLEG): container finished" podID="16b1887e-8994-401c-a175-6749c32fc173" 
containerID="a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149" exitCode=0 Sep 30 06:39:50 crc kubenswrapper[4691]: I0930 06:39:50.855250 4691 generic.go:334] "Generic (PLEG): container finished" podID="16b1887e-8994-401c-a175-6749c32fc173" containerID="1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f" exitCode=2 Sep 30 06:39:50 crc kubenswrapper[4691]: I0930 06:39:50.855262 4691 generic.go:334] "Generic (PLEG): container finished" podID="16b1887e-8994-401c-a175-6749c32fc173" containerID="229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff" exitCode=0 Sep 30 06:39:50 crc kubenswrapper[4691]: I0930 06:39:50.855358 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16b1887e-8994-401c-a175-6749c32fc173","Type":"ContainerDied","Data":"a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149"} Sep 30 06:39:50 crc kubenswrapper[4691]: I0930 06:39:50.855425 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16b1887e-8994-401c-a175-6749c32fc173","Type":"ContainerDied","Data":"1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f"} Sep 30 06:39:50 crc kubenswrapper[4691]: I0930 06:39:50.855439 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16b1887e-8994-401c-a175-6749c32fc173","Type":"ContainerDied","Data":"229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff"} Sep 30 06:39:50 crc kubenswrapper[4691]: I0930 06:39:50.863988 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.406695094 podStartE2EDuration="2.863965351s" podCreationTimestamp="2025-09-30 06:39:48 +0000 UTC" firstStartedPulling="2025-09-30 06:39:49.684820266 +0000 UTC m=+1233.159841306" lastFinishedPulling="2025-09-30 06:39:50.142090523 +0000 UTC m=+1233.617111563" observedRunningTime="2025-09-30 06:39:50.851851884 +0000 UTC m=+1234.326872954" watchObservedRunningTime="2025-09-30 06:39:50.863965351 +0000 UTC m=+1234.338986421" Sep 30 06:39:52 crc kubenswrapper[4691]: I0930 06:39:52.849613 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:39:52 crc kubenswrapper[4691]: I0930 06:39:52.850020 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.451693 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.572508 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nbk7c\" (UniqueName: \"kubernetes.io/projected/16b1887e-8994-401c-a175-6749c32fc173-kube-api-access-nbk7c\") pod \"16b1887e-8994-401c-a175-6749c32fc173\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.572691 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-config-data\") pod \"16b1887e-8994-401c-a175-6749c32fc173\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.572877 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-combined-ca-bundle\") pod \"16b1887e-8994-401c-a175-6749c32fc173\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.572973 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-run-httpd\") pod \"16b1887e-8994-401c-a175-6749c32fc173\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.573081 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-sg-core-conf-yaml\") pod \"16b1887e-8994-401c-a175-6749c32fc173\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.573216 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-log-httpd\") pod \"16b1887e-8994-401c-a175-6749c32fc173\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.573347 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-scripts\") pod \"16b1887e-8994-401c-a175-6749c32fc173\" (UID: \"16b1887e-8994-401c-a175-6749c32fc173\") " Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.575036 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "16b1887e-8994-401c-a175-6749c32fc173" (UID: "16b1887e-8994-401c-a175-6749c32fc173"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.575471 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "16b1887e-8994-401c-a175-6749c32fc173" (UID: "16b1887e-8994-401c-a175-6749c32fc173"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.579023 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-scripts" (OuterVolumeSpecName: "scripts") pod "16b1887e-8994-401c-a175-6749c32fc173" (UID: "16b1887e-8994-401c-a175-6749c32fc173"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.595719 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16b1887e-8994-401c-a175-6749c32fc173-kube-api-access-nbk7c" (OuterVolumeSpecName: "kube-api-access-nbk7c") pod "16b1887e-8994-401c-a175-6749c32fc173" (UID: "16b1887e-8994-401c-a175-6749c32fc173"). InnerVolumeSpecName "kube-api-access-nbk7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.632242 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "16b1887e-8994-401c-a175-6749c32fc173" (UID: "16b1887e-8994-401c-a175-6749c32fc173"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.676139 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nbk7c\" (UniqueName: \"kubernetes.io/projected/16b1887e-8994-401c-a175-6749c32fc173-kube-api-access-nbk7c\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.676185 4691 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-run-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.676203 4691 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.676220 4691 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/16b1887e-8994-401c-a175-6749c32fc173-log-httpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.676238 4691 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-scripts\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.689042 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16b1887e-8994-401c-a175-6749c32fc173" (UID: "16b1887e-8994-401c-a175-6749c32fc173"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.730370 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-config-data" (OuterVolumeSpecName: "config-data") pod "16b1887e-8994-401c-a175-6749c32fc173" (UID: "16b1887e-8994-401c-a175-6749c32fc173"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.778413 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.778466 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16b1887e-8994-401c-a175-6749c32fc173-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.929966 4691 generic.go:334] "Generic (PLEG): container finished" podID="16b1887e-8994-401c-a175-6749c32fc173" containerID="408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab" exitCode=0 Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.930034 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16b1887e-8994-401c-a175-6749c32fc173","Type":"ContainerDied","Data":"408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab"} Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.930323 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"16b1887e-8994-401c-a175-6749c32fc173","Type":"ContainerDied","Data":"db41a7847210f858dbed08f73548bc6eecc5500c4dc752f5489659e6e580df05"} Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.930348 4691 scope.go:117] "RemoveContainer" containerID="a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.930176 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.966459 4691 scope.go:117] "RemoveContainer" containerID="1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f" Sep 30 06:39:56 crc kubenswrapper[4691]: I0930 06:39:56.996852 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.012131 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.013399 4691 scope.go:117] "RemoveContainer" containerID="408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.028102 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:57 crc kubenswrapper[4691]: E0930 06:39:57.028630 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="ceilometer-notification-agent" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.028709 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="ceilometer-notification-agent" Sep 30 06:39:57 crc kubenswrapper[4691]: E0930 06:39:57.028760 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="ceilometer-central-agent" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.028809 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="ceilometer-central-agent" Sep 30 06:39:57 crc kubenswrapper[4691]: E0930 06:39:57.028865 4691 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="sg-core" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.028927 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="sg-core" Sep 30 06:39:57 crc kubenswrapper[4691]: E0930 06:39:57.029002 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="proxy-httpd" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.029052 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="proxy-httpd" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.029270 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="ceilometer-notification-agent" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.029326 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="proxy-httpd" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.029383 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="ceilometer-central-agent" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.029432 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="16b1887e-8994-401c-a175-6749c32fc173" containerName="sg-core" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.031401 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.039659 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.040000 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.040189 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.048743 4691 scope.go:117] "RemoveContainer" containerID="229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.050652 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.079179 4691 scope.go:117] "RemoveContainer" containerID="a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149" Sep 30 06:39:57 crc kubenswrapper[4691]: E0930 06:39:57.079725 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149\": container with ID starting with a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149 not found: ID does not exist" containerID="a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.079771 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149"} err="failed to get container status \"a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149\": rpc error: code = NotFound desc = could not find container 
\"a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149\": container with ID starting with a35a38af0cc49be7ce7dae05071e378f52d1c65c73fa412af07d1f2105d80149 not found: ID does not exist" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.079803 4691 scope.go:117] "RemoveContainer" containerID="1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f" Sep 30 06:39:57 crc kubenswrapper[4691]: E0930 06:39:57.080306 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f\": container with ID starting with 1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f not found: ID does not exist" containerID="1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.080336 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f"} err="failed to get container status \"1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f\": rpc error: code = NotFound desc = could not find container \"1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f\": container with ID starting with 1c74ee4af5e97f08a39ac81ed99a02915e646a2ebe59e58b517d5fd2e05b4f1f not found: ID does not exist" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.080357 4691 scope.go:117] "RemoveContainer" containerID="408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab" Sep 30 06:39:57 crc kubenswrapper[4691]: E0930 06:39:57.080657 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab\": container with ID starting with 408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab not found: ID does not exist" containerID="408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.080676 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab"} err="failed to get container status \"408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab\": rpc error: code = NotFound desc = could not find container \"408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab\": container with ID starting with 408800e72e6c2829153ffa6840f0defd6780fc3f5dbdc11f11b6987fc44959ab not found: ID does not exist" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.080689 4691 scope.go:117] "RemoveContainer" containerID="229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff" Sep 30 06:39:57 crc kubenswrapper[4691]: E0930 06:39:57.081161 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff\": container with ID starting with 229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff not found: ID does not exist" containerID="229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.081191 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff"} 
err="failed to get container status \"229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff\": rpc error: code = NotFound desc = could not find container \"229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff\": container with ID starting with 229254bdac01fc3350892e0dbc1acd302316b9c3aa3748d908c9106802cb52ff not found: ID does not exist" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.185257 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-log-httpd\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.185307 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.185384 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-scripts\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.185403 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.185426 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.185650 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-config-data\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.185706 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-run-httpd\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.185862 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj9ht\" (UniqueName: \"kubernetes.io/projected/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-kube-api-access-vj9ht\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.245773 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b1887e-8994-401c-a175-6749c32fc173" 
path="/var/lib/kubelet/pods/16b1887e-8994-401c-a175-6749c32fc173/volumes" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.288406 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj9ht\" (UniqueName: \"kubernetes.io/projected/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-kube-api-access-vj9ht\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.288509 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-log-httpd\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.288558 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.288649 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-scripts\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.288687 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.288732 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.288851 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-config-data\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.288921 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-run-httpd\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.288957 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-log-httpd\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.289544 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-run-httpd\") pod \"ceilometer-0\" (UID: 
\"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.294394 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.294568 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-scripts\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.295228 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.298736 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.302777 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-config-data\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.313332 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj9ht\" (UniqueName: \"kubernetes.io/projected/ac8d6a42-d8ce-419f-ae31-d9746dcedea9-kube-api-access-vj9ht\") pod \"ceilometer-0\" (UID: \"ac8d6a42-d8ce-419f-ae31-d9746dcedea9\") " pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.352421 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.849160 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Sep 30 06:39:57 crc kubenswrapper[4691]: W0930 06:39:57.859575 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac8d6a42_d8ce_419f_ae31_d9746dcedea9.slice/crio-d59f218af1d4dc8c44c4c962ed5e562fae2d3b9ad829351609a16796333a387c WatchSource:0}: Error finding container d59f218af1d4dc8c44c4c962ed5e562fae2d3b9ad829351609a16796333a387c: Status 404 returned error can't find the container with id d59f218af1d4dc8c44c4c962ed5e562fae2d3b9ad829351609a16796333a387c Sep 30 06:39:57 crc kubenswrapper[4691]: I0930 06:39:57.947196 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac8d6a42-d8ce-419f-ae31-d9746dcedea9","Type":"ContainerStarted","Data":"d59f218af1d4dc8c44c4c962ed5e562fae2d3b9ad829351609a16796333a387c"} Sep 30 06:39:58 crc kubenswrapper[4691]: I0930 06:39:58.962068 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac8d6a42-d8ce-419f-ae31-d9746dcedea9","Type":"ContainerStarted","Data":"8336583340b96001dcba7a7e3d0d68a671416bec84dac884918a56e06fd082c8"} Sep 30 06:39:58 crc kubenswrapper[4691]: I0930 06:39:58.962467 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac8d6a42-d8ce-419f-ae31-d9746dcedea9","Type":"ContainerStarted","Data":"aacea82a2fce5f13429c0a202471a392a34627f7ac863cee57ebfc39d061fd18"} Sep 30 06:39:59 crc kubenswrapper[4691]: I0930 06:39:59.201912 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Sep 30 06:39:59 crc kubenswrapper[4691]: I0930 06:39:59.973961 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac8d6a42-d8ce-419f-ae31-d9746dcedea9","Type":"ContainerStarted","Data":"67150a3108d49c25189a1e40601c2d763fe095fe8f6cafa008ce03cb9c165730"} Sep 30 06:40:00 crc kubenswrapper[4691]: I0930 06:40:00.987467 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"ac8d6a42-d8ce-419f-ae31-d9746dcedea9","Type":"ContainerStarted","Data":"045723773edee5757a5c1d2481ee2731e910ebc885a3eec4a43f412c43412a7f"} Sep 30 06:40:00 crc kubenswrapper[4691]: I0930 06:40:00.987929 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Sep 30 06:40:01 crc kubenswrapper[4691]: I0930 06:40:01.021087 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.501424173 podStartE2EDuration="5.021068497s" podCreationTimestamp="2025-09-30 06:39:56 +0000 UTC" firstStartedPulling="2025-09-30 06:39:57.86459035 +0000 UTC m=+1241.339611400" lastFinishedPulling="2025-09-30 06:40:00.384234644 +0000 UTC m=+1243.859255724" observedRunningTime="2025-09-30 06:40:01.016840561 +0000 UTC m=+1244.491861611" watchObservedRunningTime="2025-09-30 06:40:01.021068497 +0000 UTC m=+1244.496089547" Sep 30 06:40:22 crc kubenswrapper[4691]: I0930 06:40:22.849830 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:40:22 
crc kubenswrapper[4691]: I0930 06:40:22.850372 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:40:22 crc kubenswrapper[4691]: I0930 06:40:22.850421 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:40:22 crc kubenswrapper[4691]: I0930 06:40:22.851140 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b284d70235ce92b5dbfc6f06471e0d2494b74dc71ad661702951112856d0f82c"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:40:22 crc kubenswrapper[4691]: I0930 06:40:22.851205 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://b284d70235ce92b5dbfc6f06471e0d2494b74dc71ad661702951112856d0f82c" gracePeriod=600 Sep 30 06:40:23 crc kubenswrapper[4691]: I0930 06:40:23.271687 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="b284d70235ce92b5dbfc6f06471e0d2494b74dc71ad661702951112856d0f82c" exitCode=0 Sep 30 06:40:23 crc kubenswrapper[4691]: I0930 06:40:23.271769 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"b284d70235ce92b5dbfc6f06471e0d2494b74dc71ad661702951112856d0f82c"} Sep 30 06:40:23 crc kubenswrapper[4691]: I0930 06:40:23.272079 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"cdbae690f51f4bea9a63d0f6c926710bf8cae323365923c61958365eb48e16db"} Sep 30 06:40:23 crc kubenswrapper[4691]: I0930 06:40:23.272115 4691 scope.go:117] "RemoveContainer" containerID="38f0c707492af70fdfb0f260acc0b7e0af55b1c1967ae7e929f5286c470b2dd6" Sep 30 06:40:27 crc kubenswrapper[4691]: I0930 06:40:27.368380 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Sep 30 06:40:37 crc kubenswrapper[4691]: I0930 06:40:37.252974 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 06:40:38 crc kubenswrapper[4691]: I0930 06:40:38.141976 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 06:40:40 crc kubenswrapper[4691]: I0930 06:40:40.675789 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="6b6ff7c5-6146-432e-a89c-fe95ac728e5c" containerName="rabbitmq" containerID="cri-o://3a338c90117dd264286426ab2d4ddfa45266ae38adc725c1467b4129a7c3187a" gracePeriod=604797 Sep 30 06:40:41 crc kubenswrapper[4691]: I0930 06:40:41.259392 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" 
podUID="fd5df9d9-7a0a-441c-b21d-92dff2af7376" containerName="rabbitmq" containerID="cri-o://25e026665d18a5a549dc7bd0f85f93fb72649cd66d04b7d5ccd6ddae22cf463e" gracePeriod=604797 Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.513876 4691 generic.go:334] "Generic (PLEG): container finished" podID="6b6ff7c5-6146-432e-a89c-fe95ac728e5c" containerID="3a338c90117dd264286426ab2d4ddfa45266ae38adc725c1467b4129a7c3187a" exitCode=0 Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.513950 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b6ff7c5-6146-432e-a89c-fe95ac728e5c","Type":"ContainerDied","Data":"3a338c90117dd264286426ab2d4ddfa45266ae38adc725c1467b4129a7c3187a"} Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.514501 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"6b6ff7c5-6146-432e-a89c-fe95ac728e5c","Type":"ContainerDied","Data":"20832adcf2ec865fc3b6cdc4c73cacb061fc89ca927a4726e31b491eef44ecf2"} Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.514530 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20832adcf2ec865fc3b6cdc4c73cacb061fc89ca927a4726e31b491eef44ecf2" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.518520 4691 generic.go:334] "Generic (PLEG): container finished" podID="fd5df9d9-7a0a-441c-b21d-92dff2af7376" containerID="25e026665d18a5a549dc7bd0f85f93fb72649cd66d04b7d5ccd6ddae22cf463e" exitCode=0 Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.518572 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fd5df9d9-7a0a-441c-b21d-92dff2af7376","Type":"ContainerDied","Data":"25e026665d18a5a549dc7bd0f85f93fb72649cd66d04b7d5ccd6ddae22cf463e"} Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.563277 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.726252 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-plugins-conf\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.726657 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vssb\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-kube-api-access-8vssb\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.726733 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-pod-info\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.726751 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-tls\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.726773 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.726822 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-confd\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.726879 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-erlang-cookie\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.726952 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-server-conf\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.726980 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-plugins\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.727007 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-config-data\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: 
\"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.727029 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-erlang-cookie-secret\") pod \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\" (UID: \"6b6ff7c5-6146-432e-a89c-fe95ac728e5c\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.730164 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.730173 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.733303 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.737275 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-kube-api-access-8vssb" (OuterVolumeSpecName: "kube-api-access-8vssb") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "kube-api-access-8vssb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.740034 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.740802 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.741357 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-pod-info" (OuterVolumeSpecName: "pod-info") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.784166 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.789069 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-config-data" (OuterVolumeSpecName: "config-data") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.795074 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-server-conf" (OuterVolumeSpecName: "server-conf") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.829469 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vssb\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-kube-api-access-8vssb\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.829494 4691 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.829504 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.829526 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.829534 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.829543 4691 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.829550 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.829558 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc 
kubenswrapper[4691]: I0930 06:40:42.829566 4691 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.829573 4691 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.847362 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.861907 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.917351 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "6b6ff7c5-6146-432e-a89c-fe95ac728e5c" (UID: "6b6ff7c5-6146-432e-a89c-fe95ac728e5c"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.931453 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mrqcp\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-kube-api-access-mrqcp\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.931517 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-server-conf\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.931547 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fd5df9d9-7a0a-441c-b21d-92dff2af7376-erlang-cookie-secret\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.931705 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-confd\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.931806 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-plugins\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.931915 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fd5df9d9-7a0a-441c-b21d-92dff2af7376-pod-info\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: 
I0930 06:40:42.931985 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-erlang-cookie\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.932006 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-plugins-conf\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.932050 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-config-data\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.932151 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-tls\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.932199 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\" (UID: \"fd5df9d9-7a0a-441c-b21d-92dff2af7376\") " Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.932395 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.932695 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.933393 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.933435 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.933447 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6b6ff7c5-6146-432e-a89c-fe95ac728e5c-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.933455 4691 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-plugins-conf\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.935180 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.937220 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/fd5df9d9-7a0a-441c-b21d-92dff2af7376-pod-info" (OuterVolumeSpecName: "pod-info") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.940015 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.940035 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd5df9d9-7a0a-441c-b21d-92dff2af7376-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.940185 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-kube-api-access-mrqcp" (OuterVolumeSpecName: "kube-api-access-mrqcp") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "kube-api-access-mrqcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.957155 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:40:42 crc kubenswrapper[4691]: I0930 06:40:42.997734 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-config-data" (OuterVolumeSpecName: "config-data") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.011130 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-server-conf" (OuterVolumeSpecName: "server-conf") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.035331 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mrqcp\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-kube-api-access-mrqcp\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.035360 4691 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-server-conf\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.035371 4691 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/fd5df9d9-7a0a-441c-b21d-92dff2af7376-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.035379 4691 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/fd5df9d9-7a0a-441c-b21d-92dff2af7376-pod-info\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.035387 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.035395 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fd5df9d9-7a0a-441c-b21d-92dff2af7376-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.035403 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.035424 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 
06:40:43.056961 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.068311 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "fd5df9d9-7a0a-441c-b21d-92dff2af7376" (UID: "fd5df9d9-7a0a-441c-b21d-92dff2af7376"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.137207 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.137243 4691 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/fd5df9d9-7a0a-441c-b21d-92dff2af7376-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.528844 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.529112 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.529137 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"fd5df9d9-7a0a-441c-b21d-92dff2af7376","Type":"ContainerDied","Data":"cb51188eddc163015e354c12b278405a46bb52ee286d1a252cb509b6f9199636"} Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.530336 4691 scope.go:117] "RemoveContainer" containerID="25e026665d18a5a549dc7bd0f85f93fb72649cd66d04b7d5ccd6ddae22cf463e" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.554398 4691 scope.go:117] "RemoveContainer" containerID="23bbfb3a62c8abdf4f6d70e59cf97b8f36b8de2806338d340cd77a5da638b089" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.572462 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.595110 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.607303 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.616615 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.626091 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 06:40:43 crc kubenswrapper[4691]: E0930 06:40:43.626557 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5df9d9-7a0a-441c-b21d-92dff2af7376" containerName="rabbitmq" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.626569 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5df9d9-7a0a-441c-b21d-92dff2af7376" containerName="rabbitmq" Sep 30 06:40:43 crc kubenswrapper[4691]: E0930 06:40:43.626586 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6ff7c5-6146-432e-a89c-fe95ac728e5c" containerName="rabbitmq" Sep 30 06:40:43 crc 
kubenswrapper[4691]: I0930 06:40:43.626592 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6ff7c5-6146-432e-a89c-fe95ac728e5c" containerName="rabbitmq" Sep 30 06:40:43 crc kubenswrapper[4691]: E0930 06:40:43.626602 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd5df9d9-7a0a-441c-b21d-92dff2af7376" containerName="setup-container" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.626608 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd5df9d9-7a0a-441c-b21d-92dff2af7376" containerName="setup-container" Sep 30 06:40:43 crc kubenswrapper[4691]: E0930 06:40:43.626621 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b6ff7c5-6146-432e-a89c-fe95ac728e5c" containerName="setup-container" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.626627 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b6ff7c5-6146-432e-a89c-fe95ac728e5c" containerName="setup-container" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.626813 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd5df9d9-7a0a-441c-b21d-92dff2af7376" containerName="rabbitmq" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.626833 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b6ff7c5-6146-432e-a89c-fe95ac728e5c" containerName="rabbitmq" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.628026 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.634462 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.635026 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.635263 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.635364 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.635380 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.635499 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.635610 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.635844 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-lwgnz" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.637214 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.641389 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.642031 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.642158 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.642295 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.642345 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.642408 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-cvhfh" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.642555 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.642700 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.652062 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.749854 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/136adcf8-2194-4dd2-9b57-6bf571f9e295-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.749920 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.749975 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750000 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750096 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/136adcf8-2194-4dd2-9b57-6bf571f9e295-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 
crc kubenswrapper[4691]: I0930 06:40:43.750179 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750248 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750279 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750401 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/136adcf8-2194-4dd2-9b57-6bf571f9e295-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750416 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcx7h\" (UniqueName: \"kubernetes.io/projected/136adcf8-2194-4dd2-9b57-6bf571f9e295-kube-api-access-vcx7h\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750435 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9jnx\" (UniqueName: \"kubernetes.io/projected/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-kube-api-access-g9jnx\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750473 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-config-data\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750512 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750544 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 
06:40:43.750618 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750672 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750716 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/136adcf8-2194-4dd2-9b57-6bf571f9e295-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750746 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750780 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750796 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/136adcf8-2194-4dd2-9b57-6bf571f9e295-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750816 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.750839 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.852983 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/136adcf8-2194-4dd2-9b57-6bf571f9e295-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853026 4691 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-vcx7h\" (UniqueName: \"kubernetes.io/projected/136adcf8-2194-4dd2-9b57-6bf571f9e295-kube-api-access-vcx7h\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853048 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9jnx\" (UniqueName: \"kubernetes.io/projected/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-kube-api-access-g9jnx\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853074 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-config-data\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853099 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853123 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853156 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853187 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853208 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/136adcf8-2194-4dd2-9b57-6bf571f9e295-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853228 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853248 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " 
pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853264 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/136adcf8-2194-4dd2-9b57-6bf571f9e295-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853285 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853311 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853335 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/136adcf8-2194-4dd2-9b57-6bf571f9e295-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853358 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853381 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853405 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853422 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/136adcf8-2194-4dd2-9b57-6bf571f9e295-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853444 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853470 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853489 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853493 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853718 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.853998 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.854592 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.855164 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/136adcf8-2194-4dd2-9b57-6bf571f9e295-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.856046 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/136adcf8-2194-4dd2-9b57-6bf571f9e295-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.856494 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-config-data\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.856691 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " 
pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.856952 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.857548 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.860385 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.861504 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/136adcf8-2194-4dd2-9b57-6bf571f9e295-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.862176 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/136adcf8-2194-4dd2-9b57-6bf571f9e295-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.863979 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.864094 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-pod-info\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.864865 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/136adcf8-2194-4dd2-9b57-6bf571f9e295-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.867575 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.869977 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-server-conf\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.870072 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/136adcf8-2194-4dd2-9b57-6bf571f9e295-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.873806 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcx7h\" (UniqueName: \"kubernetes.io/projected/136adcf8-2194-4dd2-9b57-6bf571f9e295-kube-api-access-vcx7h\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.874049 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.874284 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9jnx\" (UniqueName: \"kubernetes.io/projected/a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5-kube-api-access-g9jnx\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.951993 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"136adcf8-2194-4dd2-9b57-6bf571f9e295\") " pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.959844 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5\") " pod="openstack/rabbitmq-server-0" Sep 30 06:40:43 crc kubenswrapper[4691]: I0930 06:40:43.980269 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Sep 30 06:40:44 crc kubenswrapper[4691]: I0930 06:40:44.252347 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:40:44 crc kubenswrapper[4691]: I0930 06:40:44.516749 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Sep 30 06:40:44 crc kubenswrapper[4691]: I0930 06:40:44.537256 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5","Type":"ContainerStarted","Data":"2c732d9439910aa14c111c7ec7282ddbe8b1c07678b3d775eafed05ebb6c890b"} Sep 30 06:40:44 crc kubenswrapper[4691]: I0930 06:40:44.729112 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Sep 30 06:40:44 crc kubenswrapper[4691]: W0930 06:40:44.729203 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod136adcf8_2194_4dd2_9b57_6bf571f9e295.slice/crio-9347e94485e4a54499496e7b537ee7b4fe65c0b953a95a7d7e4d53129caaaa6a WatchSource:0}: Error finding container 9347e94485e4a54499496e7b537ee7b4fe65c0b953a95a7d7e4d53129caaaa6a: Status 404 returned error can't find the container with id 9347e94485e4a54499496e7b537ee7b4fe65c0b953a95a7d7e4d53129caaaa6a Sep 30 06:40:45 crc kubenswrapper[4691]: I0930 06:40:45.240720 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b6ff7c5-6146-432e-a89c-fe95ac728e5c" path="/var/lib/kubelet/pods/6b6ff7c5-6146-432e-a89c-fe95ac728e5c/volumes" Sep 30 06:40:45 crc kubenswrapper[4691]: I0930 06:40:45.241961 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd5df9d9-7a0a-441c-b21d-92dff2af7376" path="/var/lib/kubelet/pods/fd5df9d9-7a0a-441c-b21d-92dff2af7376/volumes" Sep 30 06:40:45 crc kubenswrapper[4691]: I0930 06:40:45.557494 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"136adcf8-2194-4dd2-9b57-6bf571f9e295","Type":"ContainerStarted","Data":"a14b0748725812c88dba86501f5113886341896e789b5d2f7678f54c0f1124b0"} Sep 30 06:40:45 crc kubenswrapper[4691]: I0930 06:40:45.557585 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"136adcf8-2194-4dd2-9b57-6bf571f9e295","Type":"ContainerStarted","Data":"9347e94485e4a54499496e7b537ee7b4fe65c0b953a95a7d7e4d53129caaaa6a"} Sep 30 06:40:45 crc kubenswrapper[4691]: I0930 06:40:45.561069 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5","Type":"ContainerStarted","Data":"9a664b6220fd9de51776ee09736771060b0af934bda098006893447ede6eb9da"} Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.572236 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6c74869f9f-l9thw"] Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.574523 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.576819 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.643793 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c74869f9f-l9thw"] Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.725468 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-config\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.725512 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84ppr\" (UniqueName: \"kubernetes.io/projected/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-kube-api-access-84ppr\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.725604 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.725635 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-svc\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.725691 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.725713 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.725736 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-swift-storage-0\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.827230 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" 
(UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.827287 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.827325 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-swift-storage-0\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.827380 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84ppr\" (UniqueName: \"kubernetes.io/projected/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-kube-api-access-84ppr\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.827408 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-config\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.827518 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.827554 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-svc\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.828818 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-config\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.828854 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-swift-storage-0\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.828869 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-sb\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " 
pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.830449 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-openstack-edpm-ipam\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.830487 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-nb\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.830570 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-svc\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.849041 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84ppr\" (UniqueName: \"kubernetes.io/projected/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-kube-api-access-84ppr\") pod \"dnsmasq-dns-6c74869f9f-l9thw\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:51 crc kubenswrapper[4691]: I0930 06:40:51.908811 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:52 crc kubenswrapper[4691]: I0930 06:40:52.516667 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6c74869f9f-l9thw"] Sep 30 06:40:52 crc kubenswrapper[4691]: I0930 06:40:52.656851 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" event={"ID":"d3943c28-1c84-4fcb-8d95-dc6fb861fe66","Type":"ContainerStarted","Data":"bdeb3ce4dc28ba286811d959eef053e9fa0a2637cbe95c3c1ffa7daa63e44f7f"} Sep 30 06:40:53 crc kubenswrapper[4691]: I0930 06:40:53.666825 4691 generic.go:334] "Generic (PLEG): container finished" podID="d3943c28-1c84-4fcb-8d95-dc6fb861fe66" containerID="ce7c9212196a7e60635085a2fcf052bece0ef476c486b84122467a4e2976ef14" exitCode=0 Sep 30 06:40:53 crc kubenswrapper[4691]: I0930 06:40:53.666925 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" event={"ID":"d3943c28-1c84-4fcb-8d95-dc6fb861fe66","Type":"ContainerDied","Data":"ce7c9212196a7e60635085a2fcf052bece0ef476c486b84122467a4e2976ef14"} Sep 30 06:40:54 crc kubenswrapper[4691]: I0930 06:40:54.686539 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" event={"ID":"d3943c28-1c84-4fcb-8d95-dc6fb861fe66","Type":"ContainerStarted","Data":"c4110c7199953abc392001901b5aa1a76bfaaaf34aa66c684c7fecf663607f39"} Sep 30 06:40:54 crc kubenswrapper[4691]: I0930 06:40:54.687098 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:40:54 crc kubenswrapper[4691]: I0930 06:40:54.725396 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" podStartSLOduration=3.725370329 podStartE2EDuration="3.725370329s" 
podCreationTimestamp="2025-09-30 06:40:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:40:54.713835201 +0000 UTC m=+1298.188856261" watchObservedRunningTime="2025-09-30 06:40:54.725370329 +0000 UTC m=+1298.200391409" Sep 30 06:41:01 crc kubenswrapper[4691]: I0930 06:41:01.911162 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:41:01 crc kubenswrapper[4691]: I0930 06:41:01.990631 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c45569577-gvk65"] Sep 30 06:41:01 crc kubenswrapper[4691]: I0930 06:41:01.991151 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c45569577-gvk65" podUID="2caca8a1-81a4-40f6-9dc6-bcd84120889d" containerName="dnsmasq-dns" containerID="cri-o://c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87" gracePeriod=10 Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.183928 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-c6b9844bc-q6q6n"] Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.186144 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.197463 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c6b9844bc-q6q6n"] Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.378069 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-openstack-edpm-ipam\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.378146 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-dns-svc\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.378195 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-dns-swift-storage-0\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.378218 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpmm6\" (UniqueName: \"kubernetes.io/projected/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-kube-api-access-tpmm6\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.378421 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-ovsdbserver-nb\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " 
pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.378520 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-ovsdbserver-sb\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.378556 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-config\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.480796 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-dns-svc\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.480876 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-dns-swift-storage-0\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.480920 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpmm6\" (UniqueName: \"kubernetes.io/projected/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-kube-api-access-tpmm6\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.480987 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-ovsdbserver-nb\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.481018 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-ovsdbserver-sb\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.481036 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-config\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.481085 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-openstack-edpm-ipam\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" 
Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.481877 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-openstack-edpm-ipam\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.482397 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-dns-svc\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.482876 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-dns-swift-storage-0\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.483751 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-ovsdbserver-nb\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.484243 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-ovsdbserver-sb\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.484710 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-config\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.502215 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpmm6\" (UniqueName: \"kubernetes.io/projected/c18992fb-4c6e-4a18-a9b9-f00db9817b1b-kube-api-access-tpmm6\") pod \"dnsmasq-dns-c6b9844bc-q6q6n\" (UID: \"c18992fb-4c6e-4a18-a9b9-f00db9817b1b\") " pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.508090 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.600692 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c45569577-gvk65" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.784740 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-nb\") pod \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.784825 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-sb\") pod \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.784915 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-svc\") pod \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.784963 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpmdf\" (UniqueName: \"kubernetes.io/projected/2caca8a1-81a4-40f6-9dc6-bcd84120889d-kube-api-access-wpmdf\") pod \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.784997 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-config\") pod \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.785143 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-swift-storage-0\") pod \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\" (UID: \"2caca8a1-81a4-40f6-9dc6-bcd84120889d\") " Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.808708 4691 generic.go:334] "Generic (PLEG): container finished" podID="2caca8a1-81a4-40f6-9dc6-bcd84120889d" containerID="c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87" exitCode=0 Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.808750 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c45569577-gvk65" event={"ID":"2caca8a1-81a4-40f6-9dc6-bcd84120889d","Type":"ContainerDied","Data":"c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87"} Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.808776 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c45569577-gvk65" event={"ID":"2caca8a1-81a4-40f6-9dc6-bcd84120889d","Type":"ContainerDied","Data":"e84a7d408786bf83e92079a4c7ff754c8b71e4703385776fc6c293c921c6f952"} Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.808792 4691 scope.go:117] "RemoveContainer" containerID="c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.808928 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c45569577-gvk65" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.822252 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2caca8a1-81a4-40f6-9dc6-bcd84120889d-kube-api-access-wpmdf" (OuterVolumeSpecName: "kube-api-access-wpmdf") pod "2caca8a1-81a4-40f6-9dc6-bcd84120889d" (UID: "2caca8a1-81a4-40f6-9dc6-bcd84120889d"). InnerVolumeSpecName "kube-api-access-wpmdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.858869 4691 scope.go:117] "RemoveContainer" containerID="dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.865700 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2caca8a1-81a4-40f6-9dc6-bcd84120889d" (UID: "2caca8a1-81a4-40f6-9dc6-bcd84120889d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.872704 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "2caca8a1-81a4-40f6-9dc6-bcd84120889d" (UID: "2caca8a1-81a4-40f6-9dc6-bcd84120889d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.884768 4691 scope.go:117] "RemoveContainer" containerID="c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87" Sep 30 06:41:02 crc kubenswrapper[4691]: E0930 06:41:02.885203 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87\": container with ID starting with c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87 not found: ID does not exist" containerID="c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.885235 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87"} err="failed to get container status \"c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87\": rpc error: code = NotFound desc = could not find container \"c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87\": container with ID starting with c2e619eb2798157e6dae8f9c08d4a248fd4243b3eb141b1ad3a9725206cacd87 not found: ID does not exist" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.885255 4691 scope.go:117] "RemoveContainer" containerID="dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389" Sep 30 06:41:02 crc kubenswrapper[4691]: E0930 06:41:02.885751 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389\": container with ID starting with dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389 not found: ID does not exist" containerID="dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.885795 4691 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389"} err="failed to get container status \"dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389\": rpc error: code = NotFound desc = could not find container \"dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389\": container with ID starting with dbdc271250f31d3eb65b8f87ab80b0fb60a40e098c4781e4cbee5ff798c18389 not found: ID does not exist" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.891628 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.891653 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.891662 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpmdf\" (UniqueName: \"kubernetes.io/projected/2caca8a1-81a4-40f6-9dc6-bcd84120889d-kube-api-access-wpmdf\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.893938 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-config" (OuterVolumeSpecName: "config") pod "2caca8a1-81a4-40f6-9dc6-bcd84120889d" (UID: "2caca8a1-81a4-40f6-9dc6-bcd84120889d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.894109 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "2caca8a1-81a4-40f6-9dc6-bcd84120889d" (UID: "2caca8a1-81a4-40f6-9dc6-bcd84120889d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.906366 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "2caca8a1-81a4-40f6-9dc6-bcd84120889d" (UID: "2caca8a1-81a4-40f6-9dc6-bcd84120889d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.986936 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-c6b9844bc-q6q6n"] Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.993323 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.993347 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:02 crc kubenswrapper[4691]: I0930 06:41:02.993357 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/2caca8a1-81a4-40f6-9dc6-bcd84120889d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:03 crc kubenswrapper[4691]: I0930 06:41:03.217496 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c45569577-gvk65"] Sep 30 06:41:03 crc kubenswrapper[4691]: I0930 06:41:03.246926 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c45569577-gvk65"] Sep 30 06:41:03 crc kubenswrapper[4691]: I0930 06:41:03.820091 4691 generic.go:334] "Generic (PLEG): container finished" podID="c18992fb-4c6e-4a18-a9b9-f00db9817b1b" containerID="ac4e615b9cd4db6c771919e7a1c4ff059c6c5fbd4583001d18cb7a486fb4fab0" exitCode=0 Sep 30 06:41:03 crc kubenswrapper[4691]: I0930 06:41:03.820986 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" event={"ID":"c18992fb-4c6e-4a18-a9b9-f00db9817b1b","Type":"ContainerDied","Data":"ac4e615b9cd4db6c771919e7a1c4ff059c6c5fbd4583001d18cb7a486fb4fab0"} Sep 30 06:41:03 crc kubenswrapper[4691]: I0930 06:41:03.821015 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" event={"ID":"c18992fb-4c6e-4a18-a9b9-f00db9817b1b","Type":"ContainerStarted","Data":"0ca5bbdce397cd4c6bd96084d5936a1ed993e7a87d9bca630095690aa2879989"} Sep 30 06:41:04 crc kubenswrapper[4691]: I0930 06:41:04.839084 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" event={"ID":"c18992fb-4c6e-4a18-a9b9-f00db9817b1b","Type":"ContainerStarted","Data":"5a1ccca0785a96387ea341a450e0e4cd60769930892ff2f609abdee7432351de"} Sep 30 06:41:04 crc kubenswrapper[4691]: I0930 06:41:04.839725 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:04 crc kubenswrapper[4691]: I0930 06:41:04.857739 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" podStartSLOduration=2.857715813 podStartE2EDuration="2.857715813s" podCreationTimestamp="2025-09-30 06:41:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:41:04.855178082 +0000 UTC m=+1308.330199142" watchObservedRunningTime="2025-09-30 06:41:04.857715813 +0000 UTC m=+1308.332736863" Sep 30 06:41:05 crc kubenswrapper[4691]: I0930 06:41:05.235980 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2caca8a1-81a4-40f6-9dc6-bcd84120889d" path="/var/lib/kubelet/pods/2caca8a1-81a4-40f6-9dc6-bcd84120889d/volumes" Sep 30 06:41:12 crc 
kubenswrapper[4691]: I0930 06:41:12.510419 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-c6b9844bc-q6q6n" Sep 30 06:41:12 crc kubenswrapper[4691]: I0930 06:41:12.598052 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c74869f9f-l9thw"] Sep 30 06:41:12 crc kubenswrapper[4691]: I0930 06:41:12.599385 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" podUID="d3943c28-1c84-4fcb-8d95-dc6fb861fe66" containerName="dnsmasq-dns" containerID="cri-o://c4110c7199953abc392001901b5aa1a76bfaaaf34aa66c684c7fecf663607f39" gracePeriod=10 Sep 30 06:41:12 crc kubenswrapper[4691]: I0930 06:41:12.936494 4691 generic.go:334] "Generic (PLEG): container finished" podID="d3943c28-1c84-4fcb-8d95-dc6fb861fe66" containerID="c4110c7199953abc392001901b5aa1a76bfaaaf34aa66c684c7fecf663607f39" exitCode=0 Sep 30 06:41:12 crc kubenswrapper[4691]: I0930 06:41:12.936853 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" event={"ID":"d3943c28-1c84-4fcb-8d95-dc6fb861fe66","Type":"ContainerDied","Data":"c4110c7199953abc392001901b5aa1a76bfaaaf34aa66c684c7fecf663607f39"} Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.204192 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.308709 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-svc\") pod \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.308801 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-openstack-edpm-ipam\") pod \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.308860 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-sb\") pod \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.308913 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-swift-storage-0\") pod \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.308947 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-nb\") pod \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.309032 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-config\") pod \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") 
" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.309068 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84ppr\" (UniqueName: \"kubernetes.io/projected/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-kube-api-access-84ppr\") pod \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\" (UID: \"d3943c28-1c84-4fcb-8d95-dc6fb861fe66\") " Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.331053 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-kube-api-access-84ppr" (OuterVolumeSpecName: "kube-api-access-84ppr") pod "d3943c28-1c84-4fcb-8d95-dc6fb861fe66" (UID: "d3943c28-1c84-4fcb-8d95-dc6fb861fe66"). InnerVolumeSpecName "kube-api-access-84ppr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.391105 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d3943c28-1c84-4fcb-8d95-dc6fb861fe66" (UID: "d3943c28-1c84-4fcb-8d95-dc6fb861fe66"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.399637 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d3943c28-1c84-4fcb-8d95-dc6fb861fe66" (UID: "d3943c28-1c84-4fcb-8d95-dc6fb861fe66"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.399978 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "d3943c28-1c84-4fcb-8d95-dc6fb861fe66" (UID: "d3943c28-1c84-4fcb-8d95-dc6fb861fe66"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.400795 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d3943c28-1c84-4fcb-8d95-dc6fb861fe66" (UID: "d3943c28-1c84-4fcb-8d95-dc6fb861fe66"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.402535 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d3943c28-1c84-4fcb-8d95-dc6fb861fe66" (UID: "d3943c28-1c84-4fcb-8d95-dc6fb861fe66"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.405949 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-config" (OuterVolumeSpecName: "config") pod "d3943c28-1c84-4fcb-8d95-dc6fb861fe66" (UID: "d3943c28-1c84-4fcb-8d95-dc6fb861fe66"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.411636 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-config\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.411656 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84ppr\" (UniqueName: \"kubernetes.io/projected/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-kube-api-access-84ppr\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.411668 4691 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-svc\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.411677 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.411686 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.411695 4691 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.411703 4691 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d3943c28-1c84-4fcb-8d95-dc6fb861fe66-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.955050 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" event={"ID":"d3943c28-1c84-4fcb-8d95-dc6fb861fe66","Type":"ContainerDied","Data":"bdeb3ce4dc28ba286811d959eef053e9fa0a2637cbe95c3c1ffa7daa63e44f7f"} Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.955132 4691 scope.go:117] "RemoveContainer" containerID="c4110c7199953abc392001901b5aa1a76bfaaaf34aa66c684c7fecf663607f39" Sep 30 06:41:13 crc kubenswrapper[4691]: I0930 06:41:13.955163 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6c74869f9f-l9thw" Sep 30 06:41:14 crc kubenswrapper[4691]: I0930 06:41:14.002841 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6c74869f9f-l9thw"] Sep 30 06:41:14 crc kubenswrapper[4691]: I0930 06:41:14.005942 4691 scope.go:117] "RemoveContainer" containerID="ce7c9212196a7e60635085a2fcf052bece0ef476c486b84122467a4e2976ef14" Sep 30 06:41:14 crc kubenswrapper[4691]: I0930 06:41:14.010826 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6c74869f9f-l9thw"] Sep 30 06:41:15 crc kubenswrapper[4691]: I0930 06:41:15.249846 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3943c28-1c84-4fcb-8d95-dc6fb861fe66" path="/var/lib/kubelet/pods/d3943c28-1c84-4fcb-8d95-dc6fb861fe66/volumes" Sep 30 06:41:15 crc kubenswrapper[4691]: I0930 06:41:15.984755 4691 generic.go:334] "Generic (PLEG): container finished" podID="a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5" containerID="9a664b6220fd9de51776ee09736771060b0af934bda098006893447ede6eb9da" exitCode=0 Sep 30 06:41:15 crc kubenswrapper[4691]: I0930 06:41:15.984984 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5","Type":"ContainerDied","Data":"9a664b6220fd9de51776ee09736771060b0af934bda098006893447ede6eb9da"} Sep 30 06:41:15 crc kubenswrapper[4691]: I0930 06:41:15.990194 4691 generic.go:334] "Generic (PLEG): container finished" podID="136adcf8-2194-4dd2-9b57-6bf571f9e295" containerID="a14b0748725812c88dba86501f5113886341896e789b5d2f7678f54c0f1124b0" exitCode=0 Sep 30 06:41:15 crc kubenswrapper[4691]: I0930 06:41:15.990315 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"136adcf8-2194-4dd2-9b57-6bf571f9e295","Type":"ContainerDied","Data":"a14b0748725812c88dba86501f5113886341896e789b5d2f7678f54c0f1124b0"} Sep 30 06:41:17 crc kubenswrapper[4691]: I0930 06:41:17.007329 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"136adcf8-2194-4dd2-9b57-6bf571f9e295","Type":"ContainerStarted","Data":"a9c62db48d4651da130b836fd28f5943a3ee2a03faa89620e2edc675b6d386c3"} Sep 30 06:41:17 crc kubenswrapper[4691]: I0930 06:41:17.007778 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:41:17 crc kubenswrapper[4691]: I0930 06:41:17.015600 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5","Type":"ContainerStarted","Data":"1fbe8a4073f15f7fdfd4ba55fe9488373bef1d19bd8d886f24d8c4fe7feb2588"} Sep 30 06:41:17 crc kubenswrapper[4691]: I0930 06:41:17.015853 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Sep 30 06:41:17 crc kubenswrapper[4691]: I0930 06:41:17.052955 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=34.052936632 podStartE2EDuration="34.052936632s" podCreationTimestamp="2025-09-30 06:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:41:17.043186551 +0000 UTC m=+1320.518207591" watchObservedRunningTime="2025-09-30 06:41:17.052936632 +0000 UTC m=+1320.527957672" Sep 30 06:41:17 crc kubenswrapper[4691]: I0930 06:41:17.080737 4691 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=34.080717949 podStartE2EDuration="34.080717949s" podCreationTimestamp="2025-09-30 06:40:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 06:41:17.072768566 +0000 UTC m=+1320.547789606" watchObservedRunningTime="2025-09-30 06:41:17.080717949 +0000 UTC m=+1320.555738979" Sep 30 06:41:22 crc kubenswrapper[4691]: I0930 06:41:22.553078 4691 scope.go:117] "RemoveContainer" containerID="3a338c90117dd264286426ab2d4ddfa45266ae38adc725c1467b4129a7c3187a" Sep 30 06:41:22 crc kubenswrapper[4691]: I0930 06:41:22.588015 4691 scope.go:117] "RemoveContainer" containerID="e863e1d193f7d52a37582c68c9ce346b46e2528677a41ef234001593a370589f" Sep 30 06:41:22 crc kubenswrapper[4691]: I0930 06:41:22.618758 4691 scope.go:117] "RemoveContainer" containerID="b57a0525ddc4af294fd81dcc684a4dec26e9b838458599245b9086835932afd7" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.945554 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w"] Sep 30 06:41:30 crc kubenswrapper[4691]: E0930 06:41:30.946545 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caca8a1-81a4-40f6-9dc6-bcd84120889d" containerName="init" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.946561 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caca8a1-81a4-40f6-9dc6-bcd84120889d" containerName="init" Sep 30 06:41:30 crc kubenswrapper[4691]: E0930 06:41:30.946592 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3943c28-1c84-4fcb-8d95-dc6fb861fe66" containerName="dnsmasq-dns" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.946600 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3943c28-1c84-4fcb-8d95-dc6fb861fe66" containerName="dnsmasq-dns" Sep 30 06:41:30 crc kubenswrapper[4691]: E0930 06:41:30.946622 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3943c28-1c84-4fcb-8d95-dc6fb861fe66" containerName="init" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.946631 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3943c28-1c84-4fcb-8d95-dc6fb861fe66" containerName="init" Sep 30 06:41:30 crc kubenswrapper[4691]: E0930 06:41:30.946655 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2caca8a1-81a4-40f6-9dc6-bcd84120889d" containerName="dnsmasq-dns" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.946663 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2caca8a1-81a4-40f6-9dc6-bcd84120889d" containerName="dnsmasq-dns" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.946919 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3943c28-1c84-4fcb-8d95-dc6fb861fe66" containerName="dnsmasq-dns" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.946936 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="2caca8a1-81a4-40f6-9dc6-bcd84120889d" containerName="dnsmasq-dns" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.947774 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.957463 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.957645 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.957903 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.958733 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.961278 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w"] Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.994991 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.995048 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k44vq\" (UniqueName: \"kubernetes.io/projected/1ae5c682-dd33-42b2-8b7c-564876eef00a-kube-api-access-k44vq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.995314 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:30 crc kubenswrapper[4691]: I0930 06:41:30.995401 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:31 crc kubenswrapper[4691]: I0930 06:41:31.096618 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:31 crc kubenswrapper[4691]: I0930 06:41:31.096675 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:31 crc kubenswrapper[4691]: I0930 06:41:31.096841 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:31 crc kubenswrapper[4691]: I0930 06:41:31.096901 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k44vq\" (UniqueName: \"kubernetes.io/projected/1ae5c682-dd33-42b2-8b7c-564876eef00a-kube-api-access-k44vq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:31 crc kubenswrapper[4691]: I0930 06:41:31.105837 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:31 crc kubenswrapper[4691]: I0930 06:41:31.110425 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-ssh-key\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:31 crc kubenswrapper[4691]: I0930 06:41:31.113020 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:31 crc kubenswrapper[4691]: I0930 06:41:31.120562 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k44vq\" (UniqueName: \"kubernetes.io/projected/1ae5c682-dd33-42b2-8b7c-564876eef00a-kube-api-access-k44vq\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:31 crc kubenswrapper[4691]: I0930 06:41:31.276017 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:31 crc kubenswrapper[4691]: I0930 06:41:31.904772 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w"] Sep 30 06:41:31 crc kubenswrapper[4691]: W0930 06:41:31.921727 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1ae5c682_dd33_42b2_8b7c_564876eef00a.slice/crio-98456baba0a8965e6b242802edbfc234cb21892b34356c75ef841577fd514fa4 WatchSource:0}: Error finding container 98456baba0a8965e6b242802edbfc234cb21892b34356c75ef841577fd514fa4: Status 404 returned error can't find the container with id 98456baba0a8965e6b242802edbfc234cb21892b34356c75ef841577fd514fa4 Sep 30 06:41:32 crc kubenswrapper[4691]: I0930 06:41:32.190968 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" event={"ID":"1ae5c682-dd33-42b2-8b7c-564876eef00a","Type":"ContainerStarted","Data":"98456baba0a8965e6b242802edbfc234cb21892b34356c75ef841577fd514fa4"} Sep 30 06:41:33 crc kubenswrapper[4691]: I0930 06:41:33.984308 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Sep 30 06:41:34 crc kubenswrapper[4691]: I0930 06:41:34.261064 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Sep 30 06:41:41 crc kubenswrapper[4691]: I0930 06:41:41.305032 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" event={"ID":"1ae5c682-dd33-42b2-8b7c-564876eef00a","Type":"ContainerStarted","Data":"4eb9c1214b25685a643c4a608c69ee874cd2b5445b2c241cf86d84764d71ab36"} Sep 30 06:41:41 crc kubenswrapper[4691]: I0930 06:41:41.332683 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" podStartSLOduration=2.327229575 podStartE2EDuration="11.332654684s" podCreationTimestamp="2025-09-30 06:41:30 +0000 UTC" firstStartedPulling="2025-09-30 06:41:31.92384978 +0000 UTC m=+1335.398870820" lastFinishedPulling="2025-09-30 06:41:40.929274859 +0000 UTC m=+1344.404295929" observedRunningTime="2025-09-30 06:41:41.321944251 +0000 UTC m=+1344.796965341" watchObservedRunningTime="2025-09-30 06:41:41.332654684 +0000 UTC m=+1344.807675754" Sep 30 06:41:53 crc kubenswrapper[4691]: I0930 06:41:53.479575 4691 generic.go:334] "Generic (PLEG): container finished" podID="1ae5c682-dd33-42b2-8b7c-564876eef00a" containerID="4eb9c1214b25685a643c4a608c69ee874cd2b5445b2c241cf86d84764d71ab36" exitCode=0 Sep 30 06:41:53 crc kubenswrapper[4691]: I0930 06:41:53.480209 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" event={"ID":"1ae5c682-dd33-42b2-8b7c-564876eef00a","Type":"ContainerDied","Data":"4eb9c1214b25685a643c4a608c69ee874cd2b5445b2c241cf86d84764d71ab36"} Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.169827 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.342607 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-inventory\") pod \"1ae5c682-dd33-42b2-8b7c-564876eef00a\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.343093 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k44vq\" (UniqueName: \"kubernetes.io/projected/1ae5c682-dd33-42b2-8b7c-564876eef00a-kube-api-access-k44vq\") pod \"1ae5c682-dd33-42b2-8b7c-564876eef00a\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.343228 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-ssh-key\") pod \"1ae5c682-dd33-42b2-8b7c-564876eef00a\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.343399 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-repo-setup-combined-ca-bundle\") pod \"1ae5c682-dd33-42b2-8b7c-564876eef00a\" (UID: \"1ae5c682-dd33-42b2-8b7c-564876eef00a\") " Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.349790 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "1ae5c682-dd33-42b2-8b7c-564876eef00a" (UID: "1ae5c682-dd33-42b2-8b7c-564876eef00a"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.352829 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ae5c682-dd33-42b2-8b7c-564876eef00a-kube-api-access-k44vq" (OuterVolumeSpecName: "kube-api-access-k44vq") pod "1ae5c682-dd33-42b2-8b7c-564876eef00a" (UID: "1ae5c682-dd33-42b2-8b7c-564876eef00a"). InnerVolumeSpecName "kube-api-access-k44vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.380291 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-inventory" (OuterVolumeSpecName: "inventory") pod "1ae5c682-dd33-42b2-8b7c-564876eef00a" (UID: "1ae5c682-dd33-42b2-8b7c-564876eef00a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.384043 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1ae5c682-dd33-42b2-8b7c-564876eef00a" (UID: "1ae5c682-dd33-42b2-8b7c-564876eef00a"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.446807 4691 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.446855 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.446871 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k44vq\" (UniqueName: \"kubernetes.io/projected/1ae5c682-dd33-42b2-8b7c-564876eef00a-kube-api-access-k44vq\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.446986 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1ae5c682-dd33-42b2-8b7c-564876eef00a-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.511668 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" event={"ID":"1ae5c682-dd33-42b2-8b7c-564876eef00a","Type":"ContainerDied","Data":"98456baba0a8965e6b242802edbfc234cb21892b34356c75ef841577fd514fa4"} Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.511718 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98456baba0a8965e6b242802edbfc234cb21892b34356c75ef841577fd514fa4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.511746 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.609555 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4"] Sep 30 06:41:55 crc kubenswrapper[4691]: E0930 06:41:55.610085 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ae5c682-dd33-42b2-8b7c-564876eef00a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.610105 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ae5c682-dd33-42b2-8b7c-564876eef00a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.614421 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ae5c682-dd33-42b2-8b7c-564876eef00a" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.615287 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.619149 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.619264 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.626320 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.626670 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.633560 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4"] Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.651367 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ljjt4\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.651462 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ljjt4\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.651506 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7c4t\" (UniqueName: \"kubernetes.io/projected/d39e1c92-309d-4295-8f78-e9d01ffdb114-kube-api-access-s7c4t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ljjt4\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.753641 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ljjt4\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.753696 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ljjt4\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.753730 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7c4t\" (UniqueName: \"kubernetes.io/projected/d39e1c92-309d-4295-8f78-e9d01ffdb114-kube-api-access-s7c4t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ljjt4\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.760768 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ljjt4\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.767922 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-ssh-key\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ljjt4\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.782761 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7c4t\" (UniqueName: \"kubernetes.io/projected/d39e1c92-309d-4295-8f78-e9d01ffdb114-kube-api-access-s7c4t\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-ljjt4\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:55 crc kubenswrapper[4691]: I0930 06:41:55.946412 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:41:56 crc kubenswrapper[4691]: I0930 06:41:56.567525 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4"] Sep 30 06:41:57 crc kubenswrapper[4691]: I0930 06:41:57.539991 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" event={"ID":"d39e1c92-309d-4295-8f78-e9d01ffdb114","Type":"ContainerStarted","Data":"0b9f2591f88b0584f0d49e0bb9feddce1b3996ab718b4e47da4f5be2e8c9da5e"} Sep 30 06:41:57 crc kubenswrapper[4691]: I0930 06:41:57.540602 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" event={"ID":"d39e1c92-309d-4295-8f78-e9d01ffdb114","Type":"ContainerStarted","Data":"fe3bd80e4d68b5d16b5175edbb9ae1d9a43142c1493efec408a92d8a9da2354f"} Sep 30 06:41:57 crc kubenswrapper[4691]: I0930 06:41:57.566265 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" podStartSLOduration=2.121444201 podStartE2EDuration="2.566247849s" podCreationTimestamp="2025-09-30 06:41:55 +0000 UTC" firstStartedPulling="2025-09-30 06:41:56.577991241 +0000 UTC m=+1360.053012301" lastFinishedPulling="2025-09-30 06:41:57.022794909 +0000 UTC m=+1360.497815949" observedRunningTime="2025-09-30 06:41:57.561441285 +0000 UTC m=+1361.036462365" watchObservedRunningTime="2025-09-30 06:41:57.566247849 +0000 UTC m=+1361.041268889" Sep 30 06:42:00 crc kubenswrapper[4691]: I0930 06:42:00.583727 4691 generic.go:334] "Generic (PLEG): container finished" podID="d39e1c92-309d-4295-8f78-e9d01ffdb114" containerID="0b9f2591f88b0584f0d49e0bb9feddce1b3996ab718b4e47da4f5be2e8c9da5e" exitCode=0 Sep 30 06:42:00 crc kubenswrapper[4691]: I0930 06:42:00.583810 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" 
event={"ID":"d39e1c92-309d-4295-8f78-e9d01ffdb114","Type":"ContainerDied","Data":"0b9f2591f88b0584f0d49e0bb9feddce1b3996ab718b4e47da4f5be2e8c9da5e"} Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.160869 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.226488 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-inventory\") pod \"d39e1c92-309d-4295-8f78-e9d01ffdb114\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.226573 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s7c4t\" (UniqueName: \"kubernetes.io/projected/d39e1c92-309d-4295-8f78-e9d01ffdb114-kube-api-access-s7c4t\") pod \"d39e1c92-309d-4295-8f78-e9d01ffdb114\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.226688 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-ssh-key\") pod \"d39e1c92-309d-4295-8f78-e9d01ffdb114\" (UID: \"d39e1c92-309d-4295-8f78-e9d01ffdb114\") " Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.237196 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d39e1c92-309d-4295-8f78-e9d01ffdb114-kube-api-access-s7c4t" (OuterVolumeSpecName: "kube-api-access-s7c4t") pod "d39e1c92-309d-4295-8f78-e9d01ffdb114" (UID: "d39e1c92-309d-4295-8f78-e9d01ffdb114"). InnerVolumeSpecName "kube-api-access-s7c4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.272133 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d39e1c92-309d-4295-8f78-e9d01ffdb114" (UID: "d39e1c92-309d-4295-8f78-e9d01ffdb114"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.277253 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-inventory" (OuterVolumeSpecName: "inventory") pod "d39e1c92-309d-4295-8f78-e9d01ffdb114" (UID: "d39e1c92-309d-4295-8f78-e9d01ffdb114"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.329272 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.329334 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s7c4t\" (UniqueName: \"kubernetes.io/projected/d39e1c92-309d-4295-8f78-e9d01ffdb114-kube-api-access-s7c4t\") on node \"crc\" DevicePath \"\"" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.329355 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d39e1c92-309d-4295-8f78-e9d01ffdb114-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.614918 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" event={"ID":"d39e1c92-309d-4295-8f78-e9d01ffdb114","Type":"ContainerDied","Data":"fe3bd80e4d68b5d16b5175edbb9ae1d9a43142c1493efec408a92d8a9da2354f"} Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.614965 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe3bd80e4d68b5d16b5175edbb9ae1d9a43142c1493efec408a92d8a9da2354f" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.615029 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-ljjt4" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.720299 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x"] Sep 30 06:42:02 crc kubenswrapper[4691]: E0930 06:42:02.721138 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d39e1c92-309d-4295-8f78-e9d01ffdb114" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.721164 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d39e1c92-309d-4295-8f78-e9d01ffdb114" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.721431 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d39e1c92-309d-4295-8f78-e9d01ffdb114" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.722256 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.724518 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.724669 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.724832 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.727553 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.752472 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x"] Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.842409 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.842506 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zlrf\" (UniqueName: \"kubernetes.io/projected/c6027156-9dfc-40c5-b265-96d0231b32d6-kube-api-access-2zlrf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.842604 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.842816 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.945370 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.946080 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zlrf\" (UniqueName: \"kubernetes.io/projected/c6027156-9dfc-40c5-b265-96d0231b32d6-kube-api-access-2zlrf\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.946416 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.946633 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.951071 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.951819 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.952043 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-ssh-key\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:02 crc kubenswrapper[4691]: I0930 06:42:02.967578 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zlrf\" (UniqueName: \"kubernetes.io/projected/c6027156-9dfc-40c5-b265-96d0231b32d6-kube-api-access-2zlrf\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:03 crc kubenswrapper[4691]: I0930 06:42:03.043795 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:42:03 crc kubenswrapper[4691]: I0930 06:42:03.590380 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x"] Sep 30 06:42:03 crc kubenswrapper[4691]: W0930 06:42:03.593475 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6027156_9dfc_40c5_b265_96d0231b32d6.slice/crio-4cd58e146bbfec38532311f3c5cf9aa27592e950551d21774ccea08b6e38119b WatchSource:0}: Error finding container 4cd58e146bbfec38532311f3c5cf9aa27592e950551d21774ccea08b6e38119b: Status 404 returned error can't find the container with id 4cd58e146bbfec38532311f3c5cf9aa27592e950551d21774ccea08b6e38119b Sep 30 06:42:03 crc kubenswrapper[4691]: I0930 06:42:03.641609 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" event={"ID":"c6027156-9dfc-40c5-b265-96d0231b32d6","Type":"ContainerStarted","Data":"4cd58e146bbfec38532311f3c5cf9aa27592e950551d21774ccea08b6e38119b"} Sep 30 06:42:04 crc kubenswrapper[4691]: I0930 06:42:04.656551 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" event={"ID":"c6027156-9dfc-40c5-b265-96d0231b32d6","Type":"ContainerStarted","Data":"4f10c432397d0d98981e0e953ecc513ad8213e3c072d1ba55085f48e5f221a18"} Sep 30 06:42:22 crc kubenswrapper[4691]: I0930 06:42:22.811456 4691 scope.go:117] "RemoveContainer" containerID="0becdf2d4e2225f3ba85bfa6fa6aabc1805b8268d9fe2c4cf31a3f417ac427ba" Sep 30 06:42:52 crc kubenswrapper[4691]: I0930 06:42:52.850346 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:42:52 crc kubenswrapper[4691]: I0930 06:42:52.851154 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.553307 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" podStartSLOduration=76.065938499 podStartE2EDuration="1m16.553289664s" podCreationTimestamp="2025-09-30 06:42:02 +0000 UTC" firstStartedPulling="2025-09-30 06:42:03.595842918 +0000 UTC m=+1367.070863958" lastFinishedPulling="2025-09-30 06:42:04.083194083 +0000 UTC m=+1367.558215123" observedRunningTime="2025-09-30 06:42:04.695679171 +0000 UTC m=+1368.170700231" watchObservedRunningTime="2025-09-30 06:43:18.553289664 +0000 UTC m=+1442.028310714" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.567278 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lt2bd"] Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.571044 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.579559 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lt2bd"] Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.655206 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-catalog-content\") pod \"community-operators-lt2bd\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.656018 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-utilities\") pod \"community-operators-lt2bd\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.656493 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4xm9\" (UniqueName: \"kubernetes.io/projected/7a728553-134e-487a-ba95-9d3564560bfb-kube-api-access-l4xm9\") pod \"community-operators-lt2bd\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.759556 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-catalog-content\") pod \"community-operators-lt2bd\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.759629 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-utilities\") pod \"community-operators-lt2bd\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.759746 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4xm9\" (UniqueName: \"kubernetes.io/projected/7a728553-134e-487a-ba95-9d3564560bfb-kube-api-access-l4xm9\") pod \"community-operators-lt2bd\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.761425 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-utilities\") pod \"community-operators-lt2bd\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.761695 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-catalog-content\") pod \"community-operators-lt2bd\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.802004 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l4xm9\" (UniqueName: \"kubernetes.io/projected/7a728553-134e-487a-ba95-9d3564560bfb-kube-api-access-l4xm9\") pod \"community-operators-lt2bd\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:18 crc kubenswrapper[4691]: I0930 06:43:18.905304 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:19 crc kubenswrapper[4691]: I0930 06:43:19.483024 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lt2bd"] Sep 30 06:43:19 crc kubenswrapper[4691]: W0930 06:43:19.492635 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a728553_134e_487a_ba95_9d3564560bfb.slice/crio-29572f4ab63414806a596787f626359c62ba664b38f5d2899df2292c2a63a9b1 WatchSource:0}: Error finding container 29572f4ab63414806a596787f626359c62ba664b38f5d2899df2292c2a63a9b1: Status 404 returned error can't find the container with id 29572f4ab63414806a596787f626359c62ba664b38f5d2899df2292c2a63a9b1 Sep 30 06:43:19 crc kubenswrapper[4691]: I0930 06:43:19.650631 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt2bd" event={"ID":"7a728553-134e-487a-ba95-9d3564560bfb","Type":"ContainerStarted","Data":"29572f4ab63414806a596787f626359c62ba664b38f5d2899df2292c2a63a9b1"} Sep 30 06:43:20 crc kubenswrapper[4691]: I0930 06:43:20.662753 4691 generic.go:334] "Generic (PLEG): container finished" podID="7a728553-134e-487a-ba95-9d3564560bfb" containerID="f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9" exitCode=0 Sep 30 06:43:20 crc kubenswrapper[4691]: I0930 06:43:20.662900 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt2bd" event={"ID":"7a728553-134e-487a-ba95-9d3564560bfb","Type":"ContainerDied","Data":"f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9"} Sep 30 06:43:21 crc kubenswrapper[4691]: I0930 06:43:21.674795 4691 generic.go:334] "Generic (PLEG): container finished" podID="7a728553-134e-487a-ba95-9d3564560bfb" containerID="054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9" exitCode=0 Sep 30 06:43:21 crc kubenswrapper[4691]: I0930 06:43:21.674863 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt2bd" event={"ID":"7a728553-134e-487a-ba95-9d3564560bfb","Type":"ContainerDied","Data":"054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9"} Sep 30 06:43:22 crc kubenswrapper[4691]: I0930 06:43:22.687115 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt2bd" event={"ID":"7a728553-134e-487a-ba95-9d3564560bfb","Type":"ContainerStarted","Data":"90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0"} Sep 30 06:43:22 crc kubenswrapper[4691]: I0930 06:43:22.715346 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lt2bd" podStartSLOduration=3.351044135 podStartE2EDuration="4.715322722s" podCreationTimestamp="2025-09-30 06:43:18 +0000 UTC" firstStartedPulling="2025-09-30 06:43:20.665123031 +0000 UTC m=+1444.140144081" lastFinishedPulling="2025-09-30 06:43:22.029401608 +0000 UTC m=+1445.504422668" observedRunningTime="2025-09-30 06:43:22.705388264 +0000 UTC 
m=+1446.180409314" watchObservedRunningTime="2025-09-30 06:43:22.715322722 +0000 UTC m=+1446.190343772" Sep 30 06:43:22 crc kubenswrapper[4691]: I0930 06:43:22.850237 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:43:22 crc kubenswrapper[4691]: I0930 06:43:22.850307 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:43:22 crc kubenswrapper[4691]: I0930 06:43:22.923780 4691 scope.go:117] "RemoveContainer" containerID="50d0d88a5f1231386758cbaf3ab91925e14ccbf64d6051e87f834402f40a6ee1" Sep 30 06:43:22 crc kubenswrapper[4691]: I0930 06:43:22.951709 4691 scope.go:117] "RemoveContainer" containerID="1f25bc6dda3eba3f29ffd650446bc5056db4492240f2fd9795efaa7075f86d61" Sep 30 06:43:22 crc kubenswrapper[4691]: I0930 06:43:22.976416 4691 scope.go:117] "RemoveContainer" containerID="3b94374bd067443ed31a28e4b285e89c05324fc370fedd8b4d95d64d45cb5855" Sep 30 06:43:28 crc kubenswrapper[4691]: I0930 06:43:28.906371 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:28 crc kubenswrapper[4691]: I0930 06:43:28.906760 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:28 crc kubenswrapper[4691]: I0930 06:43:28.977401 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:29 crc kubenswrapper[4691]: I0930 06:43:29.849240 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:29 crc kubenswrapper[4691]: I0930 06:43:29.928306 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lt2bd"] Sep 30 06:43:31 crc kubenswrapper[4691]: I0930 06:43:31.789587 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lt2bd" podUID="7a728553-134e-487a-ba95-9d3564560bfb" containerName="registry-server" containerID="cri-o://90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0" gracePeriod=2 Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.292646 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.369287 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4xm9\" (UniqueName: \"kubernetes.io/projected/7a728553-134e-487a-ba95-9d3564560bfb-kube-api-access-l4xm9\") pod \"7a728553-134e-487a-ba95-9d3564560bfb\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.369409 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-utilities\") pod \"7a728553-134e-487a-ba95-9d3564560bfb\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.369480 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-catalog-content\") pod \"7a728553-134e-487a-ba95-9d3564560bfb\" (UID: \"7a728553-134e-487a-ba95-9d3564560bfb\") " Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.370290 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-utilities" (OuterVolumeSpecName: "utilities") pod "7a728553-134e-487a-ba95-9d3564560bfb" (UID: "7a728553-134e-487a-ba95-9d3564560bfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.381311 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a728553-134e-487a-ba95-9d3564560bfb-kube-api-access-l4xm9" (OuterVolumeSpecName: "kube-api-access-l4xm9") pod "7a728553-134e-487a-ba95-9d3564560bfb" (UID: "7a728553-134e-487a-ba95-9d3564560bfb"). InnerVolumeSpecName "kube-api-access-l4xm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.442583 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7a728553-134e-487a-ba95-9d3564560bfb" (UID: "7a728553-134e-487a-ba95-9d3564560bfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.472286 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.472555 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7a728553-134e-487a-ba95-9d3564560bfb-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.472573 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4xm9\" (UniqueName: \"kubernetes.io/projected/7a728553-134e-487a-ba95-9d3564560bfb-kube-api-access-l4xm9\") on node \"crc\" DevicePath \"\"" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.805017 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lt2bd" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.805021 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt2bd" event={"ID":"7a728553-134e-487a-ba95-9d3564560bfb","Type":"ContainerDied","Data":"90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0"} Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.805220 4691 scope.go:117] "RemoveContainer" containerID="90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.805023 4691 generic.go:334] "Generic (PLEG): container finished" podID="7a728553-134e-487a-ba95-9d3564560bfb" containerID="90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0" exitCode=0 Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.806046 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lt2bd" event={"ID":"7a728553-134e-487a-ba95-9d3564560bfb","Type":"ContainerDied","Data":"29572f4ab63414806a596787f626359c62ba664b38f5d2899df2292c2a63a9b1"} Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.829789 4691 scope.go:117] "RemoveContainer" containerID="054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.868794 4691 scope.go:117] "RemoveContainer" containerID="f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.876336 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lt2bd"] Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.892732 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lt2bd"] Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.953571 4691 scope.go:117] "RemoveContainer" containerID="90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0" Sep 30 06:43:32 crc kubenswrapper[4691]: E0930 06:43:32.954372 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0\": container with ID starting with 90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0 not found: ID does not exist" containerID="90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.954413 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0"} err="failed to get container status \"90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0\": rpc error: code = NotFound desc = could not find container \"90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0\": container with ID starting with 90cb93889cc5b780df7f85aa1f3375dc72199bc8d0b2be9fab63569aa60561e0 not found: ID does not exist" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.954438 4691 scope.go:117] "RemoveContainer" containerID="054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9" Sep 30 06:43:32 crc kubenswrapper[4691]: E0930 06:43:32.956101 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9\": container with ID 
starting with 054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9 not found: ID does not exist" containerID="054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.956137 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9"} err="failed to get container status \"054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9\": rpc error: code = NotFound desc = could not find container \"054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9\": container with ID starting with 054b68328db823a5c1f88e00aed4f7332b1fddddffd10c57499496a7e36377c9 not found: ID does not exist" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.956187 4691 scope.go:117] "RemoveContainer" containerID="f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9" Sep 30 06:43:32 crc kubenswrapper[4691]: E0930 06:43:32.957752 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9\": container with ID starting with f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9 not found: ID does not exist" containerID="f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9" Sep 30 06:43:32 crc kubenswrapper[4691]: I0930 06:43:32.957784 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9"} err="failed to get container status \"f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9\": rpc error: code = NotFound desc = could not find container \"f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9\": container with ID starting with f1f2e6da78179e3d0013f7cf637d0c0127f2ecc16ecb3516a77d5f1a247a13b9 not found: ID does not exist" Sep 30 06:43:33 crc kubenswrapper[4691]: I0930 06:43:33.250216 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a728553-134e-487a-ba95-9d3564560bfb" path="/var/lib/kubelet/pods/7a728553-134e-487a-ba95-9d3564560bfb/volumes" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.058565 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-js7vc"] Sep 30 06:43:40 crc kubenswrapper[4691]: E0930 06:43:40.059544 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a728553-134e-487a-ba95-9d3564560bfb" containerName="extract-content" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.059559 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a728553-134e-487a-ba95-9d3564560bfb" containerName="extract-content" Sep 30 06:43:40 crc kubenswrapper[4691]: E0930 06:43:40.059573 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a728553-134e-487a-ba95-9d3564560bfb" containerName="registry-server" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.059579 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a728553-134e-487a-ba95-9d3564560bfb" containerName="registry-server" Sep 30 06:43:40 crc kubenswrapper[4691]: E0930 06:43:40.059607 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a728553-134e-487a-ba95-9d3564560bfb" containerName="extract-utilities" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.059614 4691 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7a728553-134e-487a-ba95-9d3564560bfb" containerName="extract-utilities" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.059861 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a728553-134e-487a-ba95-9d3564560bfb" containerName="registry-server" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.066623 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.077819 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-js7vc"] Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.151599 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-catalog-content\") pod \"certified-operators-js7vc\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.151687 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcf8h\" (UniqueName: \"kubernetes.io/projected/79e2df2d-2e2d-45f2-90a6-719887ee5c83-kube-api-access-jcf8h\") pod \"certified-operators-js7vc\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.151768 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-utilities\") pod \"certified-operators-js7vc\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.253574 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcf8h\" (UniqueName: \"kubernetes.io/projected/79e2df2d-2e2d-45f2-90a6-719887ee5c83-kube-api-access-jcf8h\") pod \"certified-operators-js7vc\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.253691 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-utilities\") pod \"certified-operators-js7vc\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.253779 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-catalog-content\") pod \"certified-operators-js7vc\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.254255 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-catalog-content\") pod \"certified-operators-js7vc\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.254329 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-utilities\") pod \"certified-operators-js7vc\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.270848 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcf8h\" (UniqueName: \"kubernetes.io/projected/79e2df2d-2e2d-45f2-90a6-719887ee5c83-kube-api-access-jcf8h\") pod \"certified-operators-js7vc\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.385786 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:40 crc kubenswrapper[4691]: I0930 06:43:40.897722 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-js7vc"] Sep 30 06:43:41 crc kubenswrapper[4691]: I0930 06:43:41.912229 4691 generic.go:334] "Generic (PLEG): container finished" podID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerID="d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55" exitCode=0 Sep 30 06:43:41 crc kubenswrapper[4691]: I0930 06:43:41.912283 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js7vc" event={"ID":"79e2df2d-2e2d-45f2-90a6-719887ee5c83","Type":"ContainerDied","Data":"d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55"} Sep 30 06:43:41 crc kubenswrapper[4691]: I0930 06:43:41.912484 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js7vc" event={"ID":"79e2df2d-2e2d-45f2-90a6-719887ee5c83","Type":"ContainerStarted","Data":"e1a3a3febe122e6a95449b1ae93bec8f4bd0fb18f01df9628303dff431ed11bd"} Sep 30 06:43:41 crc kubenswrapper[4691]: I0930 06:43:41.916289 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:43:43 crc kubenswrapper[4691]: I0930 06:43:43.935755 4691 generic.go:334] "Generic (PLEG): container finished" podID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerID="251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150" exitCode=0 Sep 30 06:43:43 crc kubenswrapper[4691]: I0930 06:43:43.935866 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js7vc" event={"ID":"79e2df2d-2e2d-45f2-90a6-719887ee5c83","Type":"ContainerDied","Data":"251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150"} Sep 30 06:43:44 crc kubenswrapper[4691]: I0930 06:43:44.949732 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js7vc" event={"ID":"79e2df2d-2e2d-45f2-90a6-719887ee5c83","Type":"ContainerStarted","Data":"88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88"} Sep 30 06:43:44 crc kubenswrapper[4691]: I0930 06:43:44.973566 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-js7vc" podStartSLOduration=2.289412121 podStartE2EDuration="4.973540058s" podCreationTimestamp="2025-09-30 06:43:40 +0000 UTC" firstStartedPulling="2025-09-30 06:43:41.915447685 +0000 UTC m=+1465.390468765" lastFinishedPulling="2025-09-30 06:43:44.599575652 +0000 UTC m=+1468.074596702" observedRunningTime="2025-09-30 
06:43:44.969551931 +0000 UTC m=+1468.444573011" watchObservedRunningTime="2025-09-30 06:43:44.973540058 +0000 UTC m=+1468.448561098" Sep 30 06:43:50 crc kubenswrapper[4691]: I0930 06:43:50.385923 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:50 crc kubenswrapper[4691]: I0930 06:43:50.386467 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:50 crc kubenswrapper[4691]: I0930 06:43:50.459202 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:51 crc kubenswrapper[4691]: I0930 06:43:51.105597 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:51 crc kubenswrapper[4691]: I0930 06:43:51.240573 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-js7vc"] Sep 30 06:43:52 crc kubenswrapper[4691]: I0930 06:43:52.849944 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:43:52 crc kubenswrapper[4691]: I0930 06:43:52.850466 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:43:52 crc kubenswrapper[4691]: I0930 06:43:52.850543 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:43:52 crc kubenswrapper[4691]: I0930 06:43:52.851835 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdbae690f51f4bea9a63d0f6c926710bf8cae323365923c61958365eb48e16db"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:43:52 crc kubenswrapper[4691]: I0930 06:43:52.852331 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://cdbae690f51f4bea9a63d0f6c926710bf8cae323365923c61958365eb48e16db" gracePeriod=600 Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.055680 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="cdbae690f51f4bea9a63d0f6c926710bf8cae323365923c61958365eb48e16db" exitCode=0 Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.055791 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"cdbae690f51f4bea9a63d0f6c926710bf8cae323365923c61958365eb48e16db"} Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.055908 4691 scope.go:117] "RemoveContainer" 
containerID="b284d70235ce92b5dbfc6f06471e0d2494b74dc71ad661702951112856d0f82c" Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.055987 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-js7vc" podUID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerName="registry-server" containerID="cri-o://88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88" gracePeriod=2 Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.575819 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.744247 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-utilities\") pod \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.744386 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-catalog-content\") pod \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.744612 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcf8h\" (UniqueName: \"kubernetes.io/projected/79e2df2d-2e2d-45f2-90a6-719887ee5c83-kube-api-access-jcf8h\") pod \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\" (UID: \"79e2df2d-2e2d-45f2-90a6-719887ee5c83\") " Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.746217 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-utilities" (OuterVolumeSpecName: "utilities") pod "79e2df2d-2e2d-45f2-90a6-719887ee5c83" (UID: "79e2df2d-2e2d-45f2-90a6-719887ee5c83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.754676 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79e2df2d-2e2d-45f2-90a6-719887ee5c83-kube-api-access-jcf8h" (OuterVolumeSpecName: "kube-api-access-jcf8h") pod "79e2df2d-2e2d-45f2-90a6-719887ee5c83" (UID: "79e2df2d-2e2d-45f2-90a6-719887ee5c83"). InnerVolumeSpecName "kube-api-access-jcf8h". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.792592 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "79e2df2d-2e2d-45f2-90a6-719887ee5c83" (UID: "79e2df2d-2e2d-45f2-90a6-719887ee5c83"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.847603 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.847652 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcf8h\" (UniqueName: \"kubernetes.io/projected/79e2df2d-2e2d-45f2-90a6-719887ee5c83-kube-api-access-jcf8h\") on node \"crc\" DevicePath \"\"" Sep 30 06:43:53 crc kubenswrapper[4691]: I0930 06:43:53.847671 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/79e2df2d-2e2d-45f2-90a6-719887ee5c83-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.072259 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a"} Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.077745 4691 generic.go:334] "Generic (PLEG): container finished" podID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerID="88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88" exitCode=0 Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.077818 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js7vc" event={"ID":"79e2df2d-2e2d-45f2-90a6-719887ee5c83","Type":"ContainerDied","Data":"88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88"} Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.077869 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-js7vc" event={"ID":"79e2df2d-2e2d-45f2-90a6-719887ee5c83","Type":"ContainerDied","Data":"e1a3a3febe122e6a95449b1ae93bec8f4bd0fb18f01df9628303dff431ed11bd"} Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.077863 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-js7vc" Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.077922 4691 scope.go:117] "RemoveContainer" containerID="88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88" Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.111799 4691 scope.go:117] "RemoveContainer" containerID="251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150" Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.141847 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-js7vc"] Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.157635 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-js7vc"] Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.173826 4691 scope.go:117] "RemoveContainer" containerID="d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55" Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.213615 4691 scope.go:117] "RemoveContainer" containerID="88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88" Sep 30 06:43:54 crc kubenswrapper[4691]: E0930 06:43:54.214089 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88\": container with ID starting with 88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88 not found: ID does not exist" containerID="88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88" Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.214123 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88"} err="failed to get container status \"88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88\": rpc error: code = NotFound desc = could not find container \"88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88\": container with ID starting with 88b7e1de0622123904ac98bfadb72d14bf9d0608796dc0a08c05622879c27f88 not found: ID does not exist" Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.214144 4691 scope.go:117] "RemoveContainer" containerID="251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150" Sep 30 06:43:54 crc kubenswrapper[4691]: E0930 06:43:54.214593 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150\": container with ID starting with 251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150 not found: ID does not exist" containerID="251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150" Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.214619 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150"} err="failed to get container status \"251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150\": rpc error: code = NotFound desc = could not find container \"251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150\": container with ID starting with 251b3c037bba0e3a4c77cc31aaf781228c9b85b040b1b7cd135590f2c78c3150 not found: ID does not exist" Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.214634 4691 scope.go:117] "RemoveContainer" 
containerID="d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55" Sep 30 06:43:54 crc kubenswrapper[4691]: E0930 06:43:54.215107 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55\": container with ID starting with d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55 not found: ID does not exist" containerID="d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55" Sep 30 06:43:54 crc kubenswrapper[4691]: I0930 06:43:54.215145 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55"} err="failed to get container status \"d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55\": rpc error: code = NotFound desc = could not find container \"d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55\": container with ID starting with d9e5e5db064abc0fb26fbb02ee8e4a26117c6d86e2179aadd6db503ae790cc55 not found: ID does not exist" Sep 30 06:43:55 crc kubenswrapper[4691]: I0930 06:43:55.236739 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" path="/var/lib/kubelet/pods/79e2df2d-2e2d-45f2-90a6-719887ee5c83/volumes" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.113765 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6988j"] Sep 30 06:43:56 crc kubenswrapper[4691]: E0930 06:43:56.114399 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerName="extract-content" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.114417 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerName="extract-content" Sep 30 06:43:56 crc kubenswrapper[4691]: E0930 06:43:56.114456 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerName="registry-server" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.114463 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerName="registry-server" Sep 30 06:43:56 crc kubenswrapper[4691]: E0930 06:43:56.114473 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerName="extract-utilities" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.114479 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerName="extract-utilities" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.114668 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="79e2df2d-2e2d-45f2-90a6-719887ee5c83" containerName="registry-server" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.117671 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.131417 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6988j"] Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.203055 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-utilities\") pod \"redhat-operators-6988j\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.203165 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-catalog-content\") pod \"redhat-operators-6988j\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.203314 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7blpd\" (UniqueName: \"kubernetes.io/projected/a86411aa-c859-4de2-9640-8b93428d9638-kube-api-access-7blpd\") pod \"redhat-operators-6988j\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.304833 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7blpd\" (UniqueName: \"kubernetes.io/projected/a86411aa-c859-4de2-9640-8b93428d9638-kube-api-access-7blpd\") pod \"redhat-operators-6988j\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.306164 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-utilities\") pod \"redhat-operators-6988j\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.306397 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-catalog-content\") pod \"redhat-operators-6988j\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.307889 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-utilities\") pod \"redhat-operators-6988j\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.308124 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-catalog-content\") pod \"redhat-operators-6988j\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.334284 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7blpd\" (UniqueName: \"kubernetes.io/projected/a86411aa-c859-4de2-9640-8b93428d9638-kube-api-access-7blpd\") pod \"redhat-operators-6988j\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.449584 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:43:56 crc kubenswrapper[4691]: I0930 06:43:56.972213 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6988j"] Sep 30 06:43:56 crc kubenswrapper[4691]: W0930 06:43:56.980074 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda86411aa_c859_4de2_9640_8b93428d9638.slice/crio-f173559c9d04fffba89c4d14816ac19ce40ea656cad8d2c4d9328e6fd3eba6a9 WatchSource:0}: Error finding container f173559c9d04fffba89c4d14816ac19ce40ea656cad8d2c4d9328e6fd3eba6a9: Status 404 returned error can't find the container with id f173559c9d04fffba89c4d14816ac19ce40ea656cad8d2c4d9328e6fd3eba6a9 Sep 30 06:43:57 crc kubenswrapper[4691]: I0930 06:43:57.114161 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6988j" event={"ID":"a86411aa-c859-4de2-9640-8b93428d9638","Type":"ContainerStarted","Data":"f173559c9d04fffba89c4d14816ac19ce40ea656cad8d2c4d9328e6fd3eba6a9"} Sep 30 06:43:58 crc kubenswrapper[4691]: I0930 06:43:58.129002 4691 generic.go:334] "Generic (PLEG): container finished" podID="a86411aa-c859-4de2-9640-8b93428d9638" containerID="439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee" exitCode=0 Sep 30 06:43:58 crc kubenswrapper[4691]: I0930 06:43:58.129080 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6988j" event={"ID":"a86411aa-c859-4de2-9640-8b93428d9638","Type":"ContainerDied","Data":"439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee"} Sep 30 06:44:00 crc kubenswrapper[4691]: I0930 06:44:00.158726 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6988j" event={"ID":"a86411aa-c859-4de2-9640-8b93428d9638","Type":"ContainerStarted","Data":"6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644"} Sep 30 06:44:01 crc kubenswrapper[4691]: I0930 06:44:01.173629 4691 generic.go:334] "Generic (PLEG): container finished" podID="a86411aa-c859-4de2-9640-8b93428d9638" containerID="6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644" exitCode=0 Sep 30 06:44:01 crc kubenswrapper[4691]: I0930 06:44:01.173675 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6988j" event={"ID":"a86411aa-c859-4de2-9640-8b93428d9638","Type":"ContainerDied","Data":"6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644"} Sep 30 06:44:02 crc kubenswrapper[4691]: I0930 06:44:02.205204 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6988j" event={"ID":"a86411aa-c859-4de2-9640-8b93428d9638","Type":"ContainerStarted","Data":"cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675"} Sep 30 06:44:02 crc kubenswrapper[4691]: I0930 06:44:02.234455 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6988j" podStartSLOduration=2.689942543 podStartE2EDuration="6.23443001s" podCreationTimestamp="2025-09-30 
06:43:56 +0000 UTC" firstStartedPulling="2025-09-30 06:43:58.13159422 +0000 UTC m=+1481.606615300" lastFinishedPulling="2025-09-30 06:44:01.676081727 +0000 UTC m=+1485.151102767" observedRunningTime="2025-09-30 06:44:02.223032034 +0000 UTC m=+1485.698053164" watchObservedRunningTime="2025-09-30 06:44:02.23443001 +0000 UTC m=+1485.709451090" Sep 30 06:44:06 crc kubenswrapper[4691]: I0930 06:44:06.449829 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:44:06 crc kubenswrapper[4691]: I0930 06:44:06.450534 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:44:07 crc kubenswrapper[4691]: I0930 06:44:07.531297 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6988j" podUID="a86411aa-c859-4de2-9640-8b93428d9638" containerName="registry-server" probeResult="failure" output=< Sep 30 06:44:07 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Sep 30 06:44:07 crc kubenswrapper[4691]: > Sep 30 06:44:16 crc kubenswrapper[4691]: I0930 06:44:16.530479 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:44:16 crc kubenswrapper[4691]: I0930 06:44:16.600074 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:44:16 crc kubenswrapper[4691]: I0930 06:44:16.785974 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6988j"] Sep 30 06:44:18 crc kubenswrapper[4691]: I0930 06:44:18.407314 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6988j" podUID="a86411aa-c859-4de2-9640-8b93428d9638" containerName="registry-server" containerID="cri-o://cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675" gracePeriod=2 Sep 30 06:44:18 crc kubenswrapper[4691]: I0930 06:44:18.927137 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.096411 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-catalog-content\") pod \"a86411aa-c859-4de2-9640-8b93428d9638\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.096923 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7blpd\" (UniqueName: \"kubernetes.io/projected/a86411aa-c859-4de2-9640-8b93428d9638-kube-api-access-7blpd\") pod \"a86411aa-c859-4de2-9640-8b93428d9638\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.096952 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-utilities\") pod \"a86411aa-c859-4de2-9640-8b93428d9638\" (UID: \"a86411aa-c859-4de2-9640-8b93428d9638\") " Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.097863 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-utilities" (OuterVolumeSpecName: "utilities") pod "a86411aa-c859-4de2-9640-8b93428d9638" (UID: "a86411aa-c859-4de2-9640-8b93428d9638"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.107863 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a86411aa-c859-4de2-9640-8b93428d9638-kube-api-access-7blpd" (OuterVolumeSpecName: "kube-api-access-7blpd") pod "a86411aa-c859-4de2-9640-8b93428d9638" (UID: "a86411aa-c859-4de2-9640-8b93428d9638"). InnerVolumeSpecName "kube-api-access-7blpd". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.199272 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7blpd\" (UniqueName: \"kubernetes.io/projected/a86411aa-c859-4de2-9640-8b93428d9638-kube-api-access-7blpd\") on node \"crc\" DevicePath \"\"" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.199302 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.199593 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a86411aa-c859-4de2-9640-8b93428d9638" (UID: "a86411aa-c859-4de2-9640-8b93428d9638"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.303224 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a86411aa-c859-4de2-9640-8b93428d9638-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.398749 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2blsf"] Sep 30 06:44:19 crc kubenswrapper[4691]: E0930 06:44:19.399380 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86411aa-c859-4de2-9640-8b93428d9638" containerName="extract-utilities" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.399407 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86411aa-c859-4de2-9640-8b93428d9638" containerName="extract-utilities" Sep 30 06:44:19 crc kubenswrapper[4691]: E0930 06:44:19.399443 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86411aa-c859-4de2-9640-8b93428d9638" containerName="registry-server" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.399455 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86411aa-c859-4de2-9640-8b93428d9638" containerName="registry-server" Sep 30 06:44:19 crc kubenswrapper[4691]: E0930 06:44:19.399482 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a86411aa-c859-4de2-9640-8b93428d9638" containerName="extract-content" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.399493 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a86411aa-c859-4de2-9640-8b93428d9638" containerName="extract-content" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.399813 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a86411aa-c859-4de2-9640-8b93428d9638" containerName="registry-server" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.402173 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.419606 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2blsf"] Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.443562 4691 generic.go:334] "Generic (PLEG): container finished" podID="a86411aa-c859-4de2-9640-8b93428d9638" containerID="cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675" exitCode=0 Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.443621 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6988j" event={"ID":"a86411aa-c859-4de2-9640-8b93428d9638","Type":"ContainerDied","Data":"cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675"} Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.443662 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6988j" event={"ID":"a86411aa-c859-4de2-9640-8b93428d9638","Type":"ContainerDied","Data":"f173559c9d04fffba89c4d14816ac19ce40ea656cad8d2c4d9328e6fd3eba6a9"} Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.443662 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6988j" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.443684 4691 scope.go:117] "RemoveContainer" containerID="cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.480503 4691 scope.go:117] "RemoveContainer" containerID="6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.513040 4691 scope.go:117] "RemoveContainer" containerID="439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.516818 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc646\" (UniqueName: \"kubernetes.io/projected/a62421d7-6b37-479e-96f1-fdb54fedd5ab-kube-api-access-qc646\") pod \"redhat-marketplace-2blsf\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.517190 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-utilities\") pod \"redhat-marketplace-2blsf\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.517643 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-catalog-content\") pod \"redhat-marketplace-2blsf\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.539211 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6988j"] Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.550321 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6988j"] Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.590793 4691 scope.go:117] "RemoveContainer" containerID="cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675" Sep 30 06:44:19 crc kubenswrapper[4691]: E0930 06:44:19.591656 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675\": container with ID starting with cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675 not found: ID does not exist" containerID="cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.591695 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675"} err="failed to get container status \"cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675\": rpc error: code = NotFound desc = could not find container \"cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675\": container with ID starting with cf6a443d729362036962508e680481340f54e75b8a53ac73dce93c077dc1b675 not found: ID does not exist" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.591723 4691 scope.go:117] "RemoveContainer" 
containerID="6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644" Sep 30 06:44:19 crc kubenswrapper[4691]: E0930 06:44:19.592066 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644\": container with ID starting with 6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644 not found: ID does not exist" containerID="6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.592092 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644"} err="failed to get container status \"6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644\": rpc error: code = NotFound desc = could not find container \"6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644\": container with ID starting with 6d6f37fa878b6743a93c00e73f0b02e4b81c1104433c315f68ffac03582e5644 not found: ID does not exist" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.592109 4691 scope.go:117] "RemoveContainer" containerID="439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee" Sep 30 06:44:19 crc kubenswrapper[4691]: E0930 06:44:19.592352 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee\": container with ID starting with 439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee not found: ID does not exist" containerID="439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.592390 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee"} err="failed to get container status \"439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee\": rpc error: code = NotFound desc = could not find container \"439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee\": container with ID starting with 439e4acb692af7358e5dfd7c0b1166ec80efe7bc976d2fd706bee6e38da266ee not found: ID does not exist" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.622231 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qc646\" (UniqueName: \"kubernetes.io/projected/a62421d7-6b37-479e-96f1-fdb54fedd5ab-kube-api-access-qc646\") pod \"redhat-marketplace-2blsf\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.622297 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-utilities\") pod \"redhat-marketplace-2blsf\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.622407 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-catalog-content\") pod \"redhat-marketplace-2blsf\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " 
pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.623242 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-catalog-content\") pod \"redhat-marketplace-2blsf\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.623515 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-utilities\") pod \"redhat-marketplace-2blsf\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.644178 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc646\" (UniqueName: \"kubernetes.io/projected/a62421d7-6b37-479e-96f1-fdb54fedd5ab-kube-api-access-qc646\") pod \"redhat-marketplace-2blsf\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:19 crc kubenswrapper[4691]: I0930 06:44:19.744796 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:20 crc kubenswrapper[4691]: W0930 06:44:20.224213 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda62421d7_6b37_479e_96f1_fdb54fedd5ab.slice/crio-d77b79bedac34b4a9d5f7727e957982683d746e8c2399295a333d3d49502e500 WatchSource:0}: Error finding container d77b79bedac34b4a9d5f7727e957982683d746e8c2399295a333d3d49502e500: Status 404 returned error can't find the container with id d77b79bedac34b4a9d5f7727e957982683d746e8c2399295a333d3d49502e500 Sep 30 06:44:20 crc kubenswrapper[4691]: I0930 06:44:20.248861 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2blsf"] Sep 30 06:44:20 crc kubenswrapper[4691]: I0930 06:44:20.456630 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2blsf" event={"ID":"a62421d7-6b37-479e-96f1-fdb54fedd5ab","Type":"ContainerStarted","Data":"9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239"} Sep 30 06:44:20 crc kubenswrapper[4691]: I0930 06:44:20.456687 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2blsf" event={"ID":"a62421d7-6b37-479e-96f1-fdb54fedd5ab","Type":"ContainerStarted","Data":"d77b79bedac34b4a9d5f7727e957982683d746e8c2399295a333d3d49502e500"} Sep 30 06:44:21 crc kubenswrapper[4691]: I0930 06:44:21.240808 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a86411aa-c859-4de2-9640-8b93428d9638" path="/var/lib/kubelet/pods/a86411aa-c859-4de2-9640-8b93428d9638/volumes" Sep 30 06:44:21 crc kubenswrapper[4691]: I0930 06:44:21.478707 4691 generic.go:334] "Generic (PLEG): container finished" podID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerID="9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239" exitCode=0 Sep 30 06:44:21 crc kubenswrapper[4691]: I0930 06:44:21.478849 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2blsf" 
event={"ID":"a62421d7-6b37-479e-96f1-fdb54fedd5ab","Type":"ContainerDied","Data":"9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239"} Sep 30 06:44:23 crc kubenswrapper[4691]: I0930 06:44:23.504605 4691 generic.go:334] "Generic (PLEG): container finished" podID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerID="2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf" exitCode=0 Sep 30 06:44:23 crc kubenswrapper[4691]: I0930 06:44:23.504671 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2blsf" event={"ID":"a62421d7-6b37-479e-96f1-fdb54fedd5ab","Type":"ContainerDied","Data":"2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf"} Sep 30 06:44:24 crc kubenswrapper[4691]: I0930 06:44:24.517691 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2blsf" event={"ID":"a62421d7-6b37-479e-96f1-fdb54fedd5ab","Type":"ContainerStarted","Data":"80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4"} Sep 30 06:44:24 crc kubenswrapper[4691]: I0930 06:44:24.547485 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2blsf" podStartSLOduration=3.099599776 podStartE2EDuration="5.54745882s" podCreationTimestamp="2025-09-30 06:44:19 +0000 UTC" firstStartedPulling="2025-09-30 06:44:21.481849087 +0000 UTC m=+1504.956870157" lastFinishedPulling="2025-09-30 06:44:23.929708131 +0000 UTC m=+1507.404729201" observedRunningTime="2025-09-30 06:44:24.536509031 +0000 UTC m=+1508.011530111" watchObservedRunningTime="2025-09-30 06:44:24.54745882 +0000 UTC m=+1508.022479890" Sep 30 06:44:29 crc kubenswrapper[4691]: I0930 06:44:29.745481 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:29 crc kubenswrapper[4691]: I0930 06:44:29.746313 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:29 crc kubenswrapper[4691]: I0930 06:44:29.829669 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:30 crc kubenswrapper[4691]: I0930 06:44:30.658126 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:30 crc kubenswrapper[4691]: I0930 06:44:30.724725 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2blsf"] Sep 30 06:44:32 crc kubenswrapper[4691]: I0930 06:44:32.609485 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2blsf" podUID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerName="registry-server" containerID="cri-o://80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4" gracePeriod=2 Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.247185 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.355544 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-catalog-content\") pod \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.355899 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qc646\" (UniqueName: \"kubernetes.io/projected/a62421d7-6b37-479e-96f1-fdb54fedd5ab-kube-api-access-qc646\") pod \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.355940 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-utilities\") pod \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\" (UID: \"a62421d7-6b37-479e-96f1-fdb54fedd5ab\") " Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.357179 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-utilities" (OuterVolumeSpecName: "utilities") pod "a62421d7-6b37-479e-96f1-fdb54fedd5ab" (UID: "a62421d7-6b37-479e-96f1-fdb54fedd5ab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.361617 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a62421d7-6b37-479e-96f1-fdb54fedd5ab-kube-api-access-qc646" (OuterVolumeSpecName: "kube-api-access-qc646") pod "a62421d7-6b37-479e-96f1-fdb54fedd5ab" (UID: "a62421d7-6b37-479e-96f1-fdb54fedd5ab"). InnerVolumeSpecName "kube-api-access-qc646". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.369547 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a62421d7-6b37-479e-96f1-fdb54fedd5ab" (UID: "a62421d7-6b37-479e-96f1-fdb54fedd5ab"). InnerVolumeSpecName "catalog-content". 
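The `pod_startup_latency_tracker` entry above encodes a small calculation worth unpacking: `podStartE2EDuration` is `observedRunningTime` minus `podCreationTimestamp` (06:44:24.547458820 minus 06:44:19, about 5.547 s), and `podStartSLOduration` subtracts the image-pull window (`lastFinishedPulling` minus `firstStartedPulling`, about 2.448 s) from that, yielding the 3.099599776 printed. A short Go check using the exact timestamps quoted in the entry:

```go
package main

import (
	"fmt"
	"time"
)

// mustParse reads timestamps in the format the kubelet prints above,
// e.g. "2025-09-30 06:44:21.481849087 +0000 UTC".
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	created := mustParse("2025-09-30 06:44:19 +0000 UTC")
	firstPull := mustParse("2025-09-30 06:44:21.481849087 +0000 UTC")
	lastPull := mustParse("2025-09-30 06:44:23.929708131 +0000 UTC")
	running := mustParse("2025-09-30 06:44:24.54745882 +0000 UTC")

	e2e := running.Sub(created)          // podStartE2EDuration: 5.54745882s
	slo := e2e - lastPull.Sub(firstPull) // minus the pull window: 3.099599776s
	fmt.Println(e2e, slo)
}
```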
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.461897 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.461997 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qc646\" (UniqueName: \"kubernetes.io/projected/a62421d7-6b37-479e-96f1-fdb54fedd5ab-kube-api-access-qc646\") on node \"crc\" DevicePath \"\"" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.462012 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a62421d7-6b37-479e-96f1-fdb54fedd5ab-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.623396 4691 generic.go:334] "Generic (PLEG): container finished" podID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerID="80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4" exitCode=0 Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.623450 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2blsf" event={"ID":"a62421d7-6b37-479e-96f1-fdb54fedd5ab","Type":"ContainerDied","Data":"80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4"} Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.623473 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2blsf" event={"ID":"a62421d7-6b37-479e-96f1-fdb54fedd5ab","Type":"ContainerDied","Data":"d77b79bedac34b4a9d5f7727e957982683d746e8c2399295a333d3d49502e500"} Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.623491 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2blsf" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.623513 4691 scope.go:117] "RemoveContainer" containerID="80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.652084 4691 scope.go:117] "RemoveContainer" containerID="2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.683323 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2blsf"] Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.685560 4691 scope.go:117] "RemoveContainer" containerID="9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.703239 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2blsf"] Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.791798 4691 scope.go:117] "RemoveContainer" containerID="80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4" Sep 30 06:44:33 crc kubenswrapper[4691]: E0930 06:44:33.792607 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4\": container with ID starting with 80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4 not found: ID does not exist" containerID="80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.792666 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4"} err="failed to get container status \"80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4\": rpc error: code = NotFound desc = could not find container \"80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4\": container with ID starting with 80dc922c1d0484067d38893d949bc1f40d8aa1b6071c54e5672a5e11f4acdad4 not found: ID does not exist" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.792699 4691 scope.go:117] "RemoveContainer" containerID="2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf" Sep 30 06:44:33 crc kubenswrapper[4691]: E0930 06:44:33.793177 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf\": container with ID starting with 2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf not found: ID does not exist" containerID="2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.793214 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf"} err="failed to get container status \"2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf\": rpc error: code = NotFound desc = could not find container \"2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf\": container with ID starting with 2a243c1fb99a214ae52a4746e6073bdf1c627d181f1740f50b6a80cbef2882bf not found: ID does not exist" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.793243 4691 scope.go:117] "RemoveContainer" 
containerID="9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239" Sep 30 06:44:33 crc kubenswrapper[4691]: E0930 06:44:33.793543 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239\": container with ID starting with 9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239 not found: ID does not exist" containerID="9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239" Sep 30 06:44:33 crc kubenswrapper[4691]: I0930 06:44:33.793575 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239"} err="failed to get container status \"9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239\": rpc error: code = NotFound desc = could not find container \"9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239\": container with ID starting with 9f1bcbfbc6ed392c81f641cea7972d176a739f1a5ce6e729b9c1f09c1778c239 not found: ID does not exist" Sep 30 06:44:35 crc kubenswrapper[4691]: I0930 06:44:35.246124 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" path="/var/lib/kubelet/pods/a62421d7-6b37-479e-96f1-fdb54fedd5ab/volumes" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.182133 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn"] Sep 30 06:45:00 crc kubenswrapper[4691]: E0930 06:45:00.183925 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerName="extract-utilities" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.184074 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerName="extract-utilities" Sep 30 06:45:00 crc kubenswrapper[4691]: E0930 06:45:00.184143 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerName="extract-content" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.184156 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerName="extract-content" Sep 30 06:45:00 crc kubenswrapper[4691]: E0930 06:45:00.184222 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerName="registry-server" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.184233 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerName="registry-server" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.184962 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a62421d7-6b37-479e-96f1-fdb54fedd5ab" containerName="registry-server" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.186404 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.190453 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.190800 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.221277 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn"] Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.312984 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f23214a-9e66-4001-833c-671c08f7a95d-config-volume\") pod \"collect-profiles-29320245-lbznn\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.313054 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7k8lr\" (UniqueName: \"kubernetes.io/projected/2f23214a-9e66-4001-833c-671c08f7a95d-kube-api-access-7k8lr\") pod \"collect-profiles-29320245-lbznn\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.313211 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f23214a-9e66-4001-833c-671c08f7a95d-secret-volume\") pod \"collect-profiles-29320245-lbznn\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.415443 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f23214a-9e66-4001-833c-671c08f7a95d-config-volume\") pod \"collect-profiles-29320245-lbznn\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.415556 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7k8lr\" (UniqueName: \"kubernetes.io/projected/2f23214a-9e66-4001-833c-671c08f7a95d-kube-api-access-7k8lr\") pod \"collect-profiles-29320245-lbznn\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.415736 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f23214a-9e66-4001-833c-671c08f7a95d-secret-volume\") pod \"collect-profiles-29320245-lbznn\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.416756 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f23214a-9e66-4001-833c-671c08f7a95d-config-volume\") pod 
\"collect-profiles-29320245-lbznn\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.428568 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f23214a-9e66-4001-833c-671c08f7a95d-secret-volume\") pod \"collect-profiles-29320245-lbznn\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.439956 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7k8lr\" (UniqueName: \"kubernetes.io/projected/2f23214a-9e66-4001-833c-671c08f7a95d-kube-api-access-7k8lr\") pod \"collect-profiles-29320245-lbznn\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.509483 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:00 crc kubenswrapper[4691]: I0930 06:45:00.963358 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn"] Sep 30 06:45:00 crc kubenswrapper[4691]: W0930 06:45:00.968119 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f23214a_9e66_4001_833c_671c08f7a95d.slice/crio-778ec06b3480ba0a9aacf8e804f521b206d15d9a1daab4049969ab7eef5adad8 WatchSource:0}: Error finding container 778ec06b3480ba0a9aacf8e804f521b206d15d9a1daab4049969ab7eef5adad8: Status 404 returned error can't find the container with id 778ec06b3480ba0a9aacf8e804f521b206d15d9a1daab4049969ab7eef5adad8 Sep 30 06:45:01 crc kubenswrapper[4691]: I0930 06:45:01.955261 4691 generic.go:334] "Generic (PLEG): container finished" podID="2f23214a-9e66-4001-833c-671c08f7a95d" containerID="4d7b49074587bba4e2b185e585678f74b2e7eb7fc35ba041b8f19e870bc21446" exitCode=0 Sep 30 06:45:01 crc kubenswrapper[4691]: I0930 06:45:01.955302 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" event={"ID":"2f23214a-9e66-4001-833c-671c08f7a95d","Type":"ContainerDied","Data":"4d7b49074587bba4e2b185e585678f74b2e7eb7fc35ba041b8f19e870bc21446"} Sep 30 06:45:01 crc kubenswrapper[4691]: I0930 06:45:01.955568 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" event={"ID":"2f23214a-9e66-4001-833c-671c08f7a95d","Type":"ContainerStarted","Data":"778ec06b3480ba0a9aacf8e804f521b206d15d9a1daab4049969ab7eef5adad8"} Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.068124 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-create-mx4sz"] Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.079439 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-create-mx4sz"] Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.239261 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="26e581b3-ef82-4712-827a-48a328785696" path="/var/lib/kubelet/pods/26e581b3-ef82-4712-827a-48a328785696/volumes" Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.371062 4691 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.479060 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7k8lr\" (UniqueName: \"kubernetes.io/projected/2f23214a-9e66-4001-833c-671c08f7a95d-kube-api-access-7k8lr\") pod \"2f23214a-9e66-4001-833c-671c08f7a95d\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.479138 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f23214a-9e66-4001-833c-671c08f7a95d-config-volume\") pod \"2f23214a-9e66-4001-833c-671c08f7a95d\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.479297 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f23214a-9e66-4001-833c-671c08f7a95d-secret-volume\") pod \"2f23214a-9e66-4001-833c-671c08f7a95d\" (UID: \"2f23214a-9e66-4001-833c-671c08f7a95d\") " Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.479969 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f23214a-9e66-4001-833c-671c08f7a95d-config-volume" (OuterVolumeSpecName: "config-volume") pod "2f23214a-9e66-4001-833c-671c08f7a95d" (UID: "2f23214a-9e66-4001-833c-671c08f7a95d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.484804 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f23214a-9e66-4001-833c-671c08f7a95d-kube-api-access-7k8lr" (OuterVolumeSpecName: "kube-api-access-7k8lr") pod "2f23214a-9e66-4001-833c-671c08f7a95d" (UID: "2f23214a-9e66-4001-833c-671c08f7a95d"). InnerVolumeSpecName "kube-api-access-7k8lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.485895 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f23214a-9e66-4001-833c-671c08f7a95d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2f23214a-9e66-4001-833c-671c08f7a95d" (UID: "2f23214a-9e66-4001-833c-671c08f7a95d"). InnerVolumeSpecName "secret-volume". 
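The reconciler lines in this stretch trace one complete life cycle of the kubelet volume manager for the collect-profiles pod: `VerifyControllerAttachedVolume` and `MountVolume started`/`MountVolume.SetUp succeeded` while the pod is desired, then `UnmountVolume started`/`UnmountVolume.TearDown succeeded` and finally "Volume detached" once the pod is deleted. Underneath is a desired-state versus actual-state diff; the toy Go sketch below shows only that reconcile shape, with illustrative names rather than the kubelet's actual types:

```go
package main

import "fmt"

// reconcile mounts volumes that are desired but not yet actual, and unmounts
// volumes that are still actual but no longer desired, the loop the
// reconciler_common.go entries above are narrating.
func reconcile(desired, actual map[string]bool) (mount, unmount []string) {
	for v := range desired {
		if !actual[v] {
			mount = append(mount, v) // "operationExecutor.MountVolume started ..."
		}
	}
	for v := range actual {
		if !desired[v] {
			unmount = append(unmount, v) // "operationExecutor.UnmountVolume started ..."
		}
	}
	return mount, unmount
}

func main() {
	// After the collect-profiles pod above is deleted, its three volumes
	// remain in the actual state while the desired state becomes empty.
	actual := map[string]bool{
		"config-volume":         true,
		"secret-volume":         true,
		"kube-api-access-7k8lr": true,
	}
	mount, unmount := reconcile(map[string]bool{}, actual)
	fmt.Println(mount, unmount) // [] plus the three volumes to tear down
}
```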
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.582116 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7k8lr\" (UniqueName: \"kubernetes.io/projected/2f23214a-9e66-4001-833c-671c08f7a95d-kube-api-access-7k8lr\") on node \"crc\" DevicePath \"\"" Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.582171 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2f23214a-9e66-4001-833c-671c08f7a95d-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.582188 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2f23214a-9e66-4001-833c-671c08f7a95d-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.982395 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" event={"ID":"2f23214a-9e66-4001-833c-671c08f7a95d","Type":"ContainerDied","Data":"778ec06b3480ba0a9aacf8e804f521b206d15d9a1daab4049969ab7eef5adad8"} Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.982439 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="778ec06b3480ba0a9aacf8e804f521b206d15d9a1daab4049969ab7eef5adad8" Sep 30 06:45:03 crc kubenswrapper[4691]: I0930 06:45:03.982474 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn" Sep 30 06:45:09 crc kubenswrapper[4691]: I0930 06:45:09.066819 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-hskhk"] Sep 30 06:45:09 crc kubenswrapper[4691]: I0930 06:45:09.084564 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-hskhk"] Sep 30 06:45:09 crc kubenswrapper[4691]: I0930 06:45:09.239037 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="988ba2e2-8687-4a2d-91b4-f158c4725b65" path="/var/lib/kubelet/pods/988ba2e2-8687-4a2d-91b4-f158c4725b65/volumes" Sep 30 06:45:12 crc kubenswrapper[4691]: I0930 06:45:12.031749 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-cfvmh"] Sep 30 06:45:12 crc kubenswrapper[4691]: I0930 06:45:12.044227 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-jzcb7"] Sep 30 06:45:12 crc kubenswrapper[4691]: I0930 06:45:12.056415 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-cfvmh"] Sep 30 06:45:12 crc kubenswrapper[4691]: I0930 06:45:12.064567 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-jzcb7"] Sep 30 06:45:13 crc kubenswrapper[4691]: I0930 06:45:13.238192 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e" path="/var/lib/kubelet/pods/c4d262b4-6cf7-4fcc-bc2c-01fee2dbf62e/volumes" Sep 30 06:45:13 crc kubenswrapper[4691]: I0930 06:45:13.241844 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f822b186-ff4b-4190-b86f-e20bcc5ae236" path="/var/lib/kubelet/pods/f822b186-ff4b-4190-b86f-e20bcc5ae236/volumes" Sep 30 06:45:14 crc kubenswrapper[4691]: I0930 06:45:14.044310 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-a15e-account-create-nt67d"] Sep 30 06:45:14 crc 
kubenswrapper[4691]: I0930 06:45:14.061972 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-a15e-account-create-nt67d"] Sep 30 06:45:15 crc kubenswrapper[4691]: I0930 06:45:15.248015 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf1e1d74-2b4c-41a7-92fb-95337bebfd86" path="/var/lib/kubelet/pods/cf1e1d74-2b4c-41a7-92fb-95337bebfd86/volumes" Sep 30 06:45:22 crc kubenswrapper[4691]: I0930 06:45:22.036594 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-a0bc-account-create-89btn"] Sep 30 06:45:22 crc kubenswrapper[4691]: I0930 06:45:22.048697 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-3c60-account-create-l5zxb"] Sep 30 06:45:22 crc kubenswrapper[4691]: I0930 06:45:22.059969 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-3c60-account-create-l5zxb"] Sep 30 06:45:22 crc kubenswrapper[4691]: I0930 06:45:22.069422 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-a0bc-account-create-89btn"] Sep 30 06:45:22 crc kubenswrapper[4691]: I0930 06:45:22.183930 4691 generic.go:334] "Generic (PLEG): container finished" podID="c6027156-9dfc-40c5-b265-96d0231b32d6" containerID="4f10c432397d0d98981e0e953ecc513ad8213e3c072d1ba55085f48e5f221a18" exitCode=0 Sep 30 06:45:22 crc kubenswrapper[4691]: I0930 06:45:22.183976 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" event={"ID":"c6027156-9dfc-40c5-b265-96d0231b32d6","Type":"ContainerDied","Data":"4f10c432397d0d98981e0e953ecc513ad8213e3c072d1ba55085f48e5f221a18"} Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.157562 4691 scope.go:117] "RemoveContainer" containerID="7e9471ae1a26bebf564997628cdb5c4645d5b0994bb9cfa27422177a4572f239" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.193197 4691 scope.go:117] "RemoveContainer" containerID="f8edbd476f29040c250467d5a4c086a09ff5bb7829dd180fc19a411589723007" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.245561 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a0abd23-3753-4910-9b91-539dde605d21" path="/var/lib/kubelet/pods/1a0abd23-3753-4910-9b91-539dde605d21/volumes" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.248091 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e86cb2-d4ee-4828-887f-4166440d359b" path="/var/lib/kubelet/pods/b5e86cb2-d4ee-4828-887f-4166440d359b/volumes" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.267999 4691 scope.go:117] "RemoveContainer" containerID="0ad19fb7829d03a8a4136df7f8c7600016368715c9509a97e342d1da83350e0d" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.306636 4691 scope.go:117] "RemoveContainer" containerID="47f28f4f9a1d60ef2129ffbc784bffc15130cea11546ff2647590cce6a56a3d6" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.384802 4691 scope.go:117] "RemoveContainer" containerID="acd8eda80e88fd8232ebc4d373f7aa89e33d5783392512a53a5cf79bf487c07c" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.426057 4691 scope.go:117] "RemoveContainer" containerID="ba1e3bb947de74b652862715d50eb68d4a0699ecb0378f425923c6a725a18008" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.480541 4691 scope.go:117] "RemoveContainer" containerID="e62a84f0d0da996c09af481de07b33b239fa29f5417410b7a62fbaf736a983cc" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.517225 4691 scope.go:117] "RemoveContainer" 
containerID="6f46d0d929e186ca33eb3a982e34e83930476b7afccb251fe18c624e6b12dd8d" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.561488 4691 scope.go:117] "RemoveContainer" containerID="b88dbada347fb995043d43e78fbf0983e8e441303f74888d3c86bd5c9ff7933b" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.578777 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.702875 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zlrf\" (UniqueName: \"kubernetes.io/projected/c6027156-9dfc-40c5-b265-96d0231b32d6-kube-api-access-2zlrf\") pod \"c6027156-9dfc-40c5-b265-96d0231b32d6\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.703024 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-bootstrap-combined-ca-bundle\") pod \"c6027156-9dfc-40c5-b265-96d0231b32d6\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.703079 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-ssh-key\") pod \"c6027156-9dfc-40c5-b265-96d0231b32d6\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.703198 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-inventory\") pod \"c6027156-9dfc-40c5-b265-96d0231b32d6\" (UID: \"c6027156-9dfc-40c5-b265-96d0231b32d6\") " Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.708967 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "c6027156-9dfc-40c5-b265-96d0231b32d6" (UID: "c6027156-9dfc-40c5-b265-96d0231b32d6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.710149 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6027156-9dfc-40c5-b265-96d0231b32d6-kube-api-access-2zlrf" (OuterVolumeSpecName: "kube-api-access-2zlrf") pod "c6027156-9dfc-40c5-b265-96d0231b32d6" (UID: "c6027156-9dfc-40c5-b265-96d0231b32d6"). InnerVolumeSpecName "kube-api-access-2zlrf". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.739522 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c6027156-9dfc-40c5-b265-96d0231b32d6" (UID: "c6027156-9dfc-40c5-b265-96d0231b32d6"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.750123 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-inventory" (OuterVolumeSpecName: "inventory") pod "c6027156-9dfc-40c5-b265-96d0231b32d6" (UID: "c6027156-9dfc-40c5-b265-96d0231b32d6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.806052 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.806085 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zlrf\" (UniqueName: \"kubernetes.io/projected/c6027156-9dfc-40c5-b265-96d0231b32d6-kube-api-access-2zlrf\") on node \"crc\" DevicePath \"\"" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.806099 4691 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:45:23 crc kubenswrapper[4691]: I0930 06:45:23.806109 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6027156-9dfc-40c5-b265-96d0231b32d6-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.214625 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x" event={"ID":"c6027156-9dfc-40c5-b265-96d0231b32d6","Type":"ContainerDied","Data":"4cd58e146bbfec38532311f3c5cf9aa27592e950551d21774ccea08b6e38119b"} Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.214681 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd58e146bbfec38532311f3c5cf9aa27592e950551d21774ccea08b6e38119b" Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.214683 4691 util.go:48] "No ready sandbox for pod can be found. 
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.332005 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"]
Sep 30 06:45:24 crc kubenswrapper[4691]: E0930 06:45:24.333755 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6027156-9dfc-40c5-b265-96d0231b32d6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.334289 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6027156-9dfc-40c5-b265-96d0231b32d6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Sep 30 06:45:24 crc kubenswrapper[4691]: E0930 06:45:24.334554 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f23214a-9e66-4001-833c-671c08f7a95d" containerName="collect-profiles"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.334675 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f23214a-9e66-4001-833c-671c08f7a95d" containerName="collect-profiles"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.335732 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6027156-9dfc-40c5-b265-96d0231b32d6" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.335761 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f23214a-9e66-4001-833c-671c08f7a95d" containerName="collect-profiles"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.342478 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.347131 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.347441 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.347541 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.347818 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.356410 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"]
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.436482 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4shp\" (UniqueName: \"kubernetes.io/projected/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-kube-api-access-q4shp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.436622 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.437242 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.539309 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.539407 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4shp\" (UniqueName: \"kubernetes.io/projected/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-kube-api-access-q4shp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.539491 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.545403 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-ssh-key\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.546974 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.580281 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4shp\" (UniqueName: \"kubernetes.io/projected/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-kube-api-access-q4shp\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:24 crc kubenswrapper[4691]: I0930 06:45:24.665177 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"
Sep 30 06:45:25 crc kubenswrapper[4691]: I0930 06:45:25.243575 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh"]
Sep 30 06:45:26 crc kubenswrapper[4691]: I0930 06:45:26.254021 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh" event={"ID":"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b","Type":"ContainerStarted","Data":"b74467ff3c7928d57f881fba19a137c5514e630c12fde0ce305f9aa30be75e3c"}
Sep 30 06:45:26 crc kubenswrapper[4691]: I0930 06:45:26.254302 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh" event={"ID":"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b","Type":"ContainerStarted","Data":"a0a1900f5bc8526c084971ddc5ae0c42187903b14d5394eb9419cb1c83c7912f"}
Sep 30 06:45:26 crc kubenswrapper[4691]: I0930 06:45:26.281257 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh" podStartSLOduration=1.761202022 podStartE2EDuration="2.281226944s" podCreationTimestamp="2025-09-30 06:45:24 +0000 UTC" firstStartedPulling="2025-09-30 06:45:25.252258202 +0000 UTC m=+1568.727279232" lastFinishedPulling="2025-09-30 06:45:25.772283074 +0000 UTC m=+1569.247304154" observedRunningTime="2025-09-30 06:45:26.279368265 +0000 UTC m=+1569.754389335" watchObservedRunningTime="2025-09-30 06:45:26.281226944 +0000 UTC m=+1569.756248034"
Sep 30 06:45:28 crc kubenswrapper[4691]: I0930 06:45:28.036609 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-9400-account-create-2jdrq"]
Sep 30 06:45:28 crc kubenswrapper[4691]: I0930 06:45:28.055763 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-9400-account-create-2jdrq"]
Sep 30 06:45:29 crc kubenswrapper[4691]: I0930 06:45:29.245624 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da5644f7-944d-4df5-ae95-535bbf9399a1" path="/var/lib/kubelet/pods/da5644f7-944d-4df5-ae95-535bbf9399a1/volumes"
Sep 30 06:45:53 crc kubenswrapper[4691]: I0930 06:45:53.060715 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-4dcl7"]
Sep 30 06:45:53 crc kubenswrapper[4691]: I0930 06:45:53.072570 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-x5kh8"]
Sep 30 06:45:53 crc kubenswrapper[4691]: I0930 06:45:53.089931 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-z5k76"]
Sep 30 06:45:53 crc kubenswrapper[4691]: I0930 06:45:53.105425 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-4dcl7"]
Sep 30 06:45:53 crc kubenswrapper[4691]: I0930 06:45:53.118367 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-x5kh8"]
Sep 30 06:45:53 crc kubenswrapper[4691]: I0930 06:45:53.128674 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-z5k76"]
Sep 30 06:45:53 crc kubenswrapper[4691]: I0930 06:45:53.240473 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16b4442d-198c-4824-a8f3-3fbfd345e87f" path="/var/lib/kubelet/pods/16b4442d-198c-4824-a8f3-3fbfd345e87f/volumes"
Sep 30 06:45:53 crc kubenswrapper[4691]: I0930 06:45:53.243110 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c83300e-e91d-43bf-a9b7-ee763cea39b2" path="/var/lib/kubelet/pods/5c83300e-e91d-43bf-a9b7-ee763cea39b2/volumes"
Sep 30 06:45:53 crc kubenswrapper[4691]: I0930 06:45:53.244453 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5ed4c1d-acca-4979-876b-1b0fbb34443c" path="/var/lib/kubelet/pods/b5ed4c1d-acca-4979-876b-1b0fbb34443c/volumes"
Sep 30 06:45:59 crc kubenswrapper[4691]: I0930 06:45:59.037698 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-gvbbh"]
Sep 30 06:45:59 crc kubenswrapper[4691]: I0930 06:45:59.048271 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-gvbbh"]
Sep 30 06:45:59 crc kubenswrapper[4691]: I0930 06:45:59.239627 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d87d59-039d-4ccf-a112-2beb7059e140" path="/var/lib/kubelet/pods/30d87d59-039d-4ccf-a112-2beb7059e140/volumes"
Sep 30 06:46:03 crc kubenswrapper[4691]: I0930 06:46:03.041427 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/watcher-db-sync-k2cvv"]
Sep 30 06:46:03 crc kubenswrapper[4691]: I0930 06:46:03.070747 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/watcher-db-sync-k2cvv"]
Sep 30 06:46:03 crc kubenswrapper[4691]: I0930 06:46:03.240705 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22" path="/var/lib/kubelet/pods/097bc1bb-8fd4-4ccd-ad03-246aa9ecdd22/volumes"
Sep 30 06:46:04 crc kubenswrapper[4691]: I0930 06:46:04.048786 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-6grqn"]
Sep 30 06:46:04 crc kubenswrapper[4691]: I0930 06:46:04.060208 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-6grqn"]
Sep 30 06:46:05 crc kubenswrapper[4691]: I0930 06:46:05.248502 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29abdc62-6e41-49f9-8426-f8b4c1f25014" path="/var/lib/kubelet/pods/29abdc62-6e41-49f9-8426-f8b4c1f25014/volumes"
Sep 30 06:46:15 crc kubenswrapper[4691]: I0930 06:46:15.066359 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-d99e-account-create-zrrb2"]
Sep 30 06:46:15 crc kubenswrapper[4691]: I0930 06:46:15.090195 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-7341-account-create-ln8wm"]
Sep 30 06:46:15 crc kubenswrapper[4691]: I0930 06:46:15.103597 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c5ed-account-create-d7rxh"]
Sep 30 06:46:15 crc kubenswrapper[4691]: I0930 06:46:15.114395 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-7341-account-create-ln8wm"]
Sep 30 06:46:15 crc kubenswrapper[4691]: I0930 06:46:15.125803 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-d99e-account-create-zrrb2"]
Sep 30 06:46:15 crc kubenswrapper[4691]: I0930 06:46:15.135410 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c5ed-account-create-d7rxh"]
Sep 30 06:46:15 crc kubenswrapper[4691]: I0930 06:46:15.250085 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ce9ee68-c2b5-456f-9a12-5493b94729ea" path="/var/lib/kubelet/pods/0ce9ee68-c2b5-456f-9a12-5493b94729ea/volumes"
Sep 30 06:46:15 crc kubenswrapper[4691]: I0930 06:46:15.251502 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f76bf7e-6020-4d8a-a15c-f2d497629fd9" path="/var/lib/kubelet/pods/4f76bf7e-6020-4d8a-a15c-f2d497629fd9/volumes"
path="/var/lib/kubelet/pods/4f76bf7e-6020-4d8a-a15c-f2d497629fd9/volumes" Sep 30 06:46:15 crc kubenswrapper[4691]: I0930 06:46:15.254341 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cae1274f-384f-452d-b80f-1c2a3712bb49" path="/var/lib/kubelet/pods/cae1274f-384f-452d-b80f-1c2a3712bb49/volumes" Sep 30 06:46:22 crc kubenswrapper[4691]: I0930 06:46:22.850038 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:46:22 crc kubenswrapper[4691]: I0930 06:46:22.851230 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:46:23 crc kubenswrapper[4691]: I0930 06:46:23.775737 4691 scope.go:117] "RemoveContainer" containerID="ea47924e4c369891476df9ecbb29c51739229ec3fd94fbc7ec8eb4455e40c7a7" Sep 30 06:46:23 crc kubenswrapper[4691]: I0930 06:46:23.831632 4691 scope.go:117] "RemoveContainer" containerID="6922a5c6d13c5d78edb8eeb161243af2152c9433ba169d033271e10bf486a4ed" Sep 30 06:46:23 crc kubenswrapper[4691]: I0930 06:46:23.956120 4691 scope.go:117] "RemoveContainer" containerID="fdc084a5e240d7ff6e63e954c36d38a5acc82bda4ee9495763fb9ce7d6d84272" Sep 30 06:46:23 crc kubenswrapper[4691]: I0930 06:46:23.995499 4691 scope.go:117] "RemoveContainer" containerID="b315a9a23aa8a12ed4e264ade7a72252704a75749ba990f4d4a22db0dcd64e9f" Sep 30 06:46:24 crc kubenswrapper[4691]: I0930 06:46:24.056582 4691 scope.go:117] "RemoveContainer" containerID="1836171cceace96fc472d2ebd5aed274964abb8c115c73e77f97741aa6d01dad" Sep 30 06:46:24 crc kubenswrapper[4691]: I0930 06:46:24.143015 4691 scope.go:117] "RemoveContainer" containerID="6f136ee1f188d34813d25f3104b5f58f7826464fcf95988ad8bea55f1af5a2a7" Sep 30 06:46:24 crc kubenswrapper[4691]: I0930 06:46:24.193110 4691 scope.go:117] "RemoveContainer" containerID="0d7718ca1e11e88c44d8fa4dfb5710bb0f114057494cb8b506bac1c3b3c9bc3b" Sep 30 06:46:24 crc kubenswrapper[4691]: I0930 06:46:24.234317 4691 scope.go:117] "RemoveContainer" containerID="6ea42558cd89fdbcf602cd6366287450c7f412f4318d07fe01f54914a22f3b1f" Sep 30 06:46:24 crc kubenswrapper[4691]: I0930 06:46:24.263135 4691 scope.go:117] "RemoveContainer" containerID="3a051da935daad5848c7eff04402e13b3aae4eb8b6ebcbb57bf8a6a9fddcf520" Sep 30 06:46:24 crc kubenswrapper[4691]: I0930 06:46:24.293714 4691 scope.go:117] "RemoveContainer" containerID="f98c62251adcc264b995ebac457d42302c3ef8121e140070275e014a1e9248bb" Sep 30 06:46:31 crc kubenswrapper[4691]: I0930 06:46:31.045387 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-jfg2g"] Sep 30 06:46:31 crc kubenswrapper[4691]: I0930 06:46:31.057165 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-jfg2g"] Sep 30 06:46:31 crc kubenswrapper[4691]: I0930 06:46:31.244493 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6315532b-2604-4b19-8b3f-cb4bb9ff83f6" path="/var/lib/kubelet/pods/6315532b-2604-4b19-8b3f-cb4bb9ff83f6/volumes" Sep 30 06:46:35 crc kubenswrapper[4691]: I0930 06:46:35.034129 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/keystone-bootstrap-j7phx"] Sep 30 06:46:35 crc kubenswrapper[4691]: I0930 06:46:35.058935 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-j7phx"] Sep 30 06:46:35 crc kubenswrapper[4691]: I0930 06:46:35.239083 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca252689-a07e-4e84-a79f-7884687c6db3" path="/var/lib/kubelet/pods/ca252689-a07e-4e84-a79f-7884687c6db3/volumes" Sep 30 06:46:52 crc kubenswrapper[4691]: I0930 06:46:52.850650 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:46:52 crc kubenswrapper[4691]: I0930 06:46:52.851245 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:46:53 crc kubenswrapper[4691]: I0930 06:46:53.065137 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-sb2qw"] Sep 30 06:46:53 crc kubenswrapper[4691]: I0930 06:46:53.077691 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-sb2qw"] Sep 30 06:46:53 crc kubenswrapper[4691]: I0930 06:46:53.242751 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f010c19-f02f-4c8b-8b12-1f357e860666" path="/var/lib/kubelet/pods/1f010c19-f02f-4c8b-8b12-1f357e860666/volumes" Sep 30 06:47:01 crc kubenswrapper[4691]: I0930 06:47:01.046200 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-pblkf"] Sep 30 06:47:01 crc kubenswrapper[4691]: I0930 06:47:01.061337 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-pblkf"] Sep 30 06:47:01 crc kubenswrapper[4691]: I0930 06:47:01.237548 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e1ab390-f1ae-4ec9-b5d6-fb137a511e21" path="/var/lib/kubelet/pods/3e1ab390-f1ae-4ec9-b5d6-fb137a511e21/volumes" Sep 30 06:47:04 crc kubenswrapper[4691]: I0930 06:47:04.476861 4691 generic.go:334] "Generic (PLEG): container finished" podID="f8536c3f-e28e-49a1-9b22-bb6ab2652c5b" containerID="b74467ff3c7928d57f881fba19a137c5514e630c12fde0ce305f9aa30be75e3c" exitCode=0 Sep 30 06:47:04 crc kubenswrapper[4691]: I0930 06:47:04.476966 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh" event={"ID":"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b","Type":"ContainerDied","Data":"b74467ff3c7928d57f881fba19a137c5514e630c12fde0ce305f9aa30be75e3c"} Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.013748 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.098177 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-inventory\") pod \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.098253 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-ssh-key\") pod \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.098369 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4shp\" (UniqueName: \"kubernetes.io/projected/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-kube-api-access-q4shp\") pod \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\" (UID: \"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b\") " Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.104824 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-kube-api-access-q4shp" (OuterVolumeSpecName: "kube-api-access-q4shp") pod "f8536c3f-e28e-49a1-9b22-bb6ab2652c5b" (UID: "f8536c3f-e28e-49a1-9b22-bb6ab2652c5b"). InnerVolumeSpecName "kube-api-access-q4shp". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.127724 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-inventory" (OuterVolumeSpecName: "inventory") pod "f8536c3f-e28e-49a1-9b22-bb6ab2652c5b" (UID: "f8536c3f-e28e-49a1-9b22-bb6ab2652c5b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.134860 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "f8536c3f-e28e-49a1-9b22-bb6ab2652c5b" (UID: "f8536c3f-e28e-49a1-9b22-bb6ab2652c5b"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.200954 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.201055 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.201071 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4shp\" (UniqueName: \"kubernetes.io/projected/f8536c3f-e28e-49a1-9b22-bb6ab2652c5b-kube-api-access-q4shp\") on node \"crc\" DevicePath \"\"" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.505417 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh" event={"ID":"f8536c3f-e28e-49a1-9b22-bb6ab2652c5b","Type":"ContainerDied","Data":"a0a1900f5bc8526c084971ddc5ae0c42187903b14d5394eb9419cb1c83c7912f"} Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.505476 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0a1900f5bc8526c084971ddc5ae0c42187903b14d5394eb9419cb1c83c7912f" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.505564 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.720961 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll"] Sep 30 06:47:06 crc kubenswrapper[4691]: E0930 06:47:06.722034 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8536c3f-e28e-49a1-9b22-bb6ab2652c5b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.722062 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8536c3f-e28e-49a1-9b22-bb6ab2652c5b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.722563 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8536c3f-e28e-49a1-9b22-bb6ab2652c5b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.728192 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.733229 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.733550 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.733796 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.752179 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.759423 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll"] Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.838641 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4xsc\" (UniqueName: \"kubernetes.io/projected/6b78a233-7f96-48a0-b484-0bb1196d8d4e-kube-api-access-q4xsc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-528ll\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.838807 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-528ll\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.838879 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-528ll\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.941631 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4xsc\" (UniqueName: \"kubernetes.io/projected/6b78a233-7f96-48a0-b484-0bb1196d8d4e-kube-api-access-q4xsc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-528ll\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.942047 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-528ll\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.942152 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-inventory\") 
pod \"configure-network-edpm-deployment-openstack-edpm-ipam-528ll\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.959990 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-ssh-key\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-528ll\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.960177 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-528ll\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:06 crc kubenswrapper[4691]: I0930 06:47:06.972595 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4xsc\" (UniqueName: \"kubernetes.io/projected/6b78a233-7f96-48a0-b484-0bb1196d8d4e-kube-api-access-q4xsc\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-528ll\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:07 crc kubenswrapper[4691]: I0930 06:47:07.073251 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:47:07 crc kubenswrapper[4691]: I0930 06:47:07.739724 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll"] Sep 30 06:47:08 crc kubenswrapper[4691]: I0930 06:47:08.530216 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" event={"ID":"6b78a233-7f96-48a0-b484-0bb1196d8d4e","Type":"ContainerStarted","Data":"527289b6941044633c29b52cd1811c696a676ab8b35d77cbbb537bc59e31cb73"} Sep 30 06:47:08 crc kubenswrapper[4691]: I0930 06:47:08.530607 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" event={"ID":"6b78a233-7f96-48a0-b484-0bb1196d8d4e","Type":"ContainerStarted","Data":"d3b3a03c93adaceaf3209118d99c97dc323129c07ac4a4e5b12e1ee0de4f0a82"} Sep 30 06:47:08 crc kubenswrapper[4691]: I0930 06:47:08.566032 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" podStartSLOduration=2.066735711 podStartE2EDuration="2.565965697s" podCreationTimestamp="2025-09-30 06:47:06 +0000 UTC" firstStartedPulling="2025-09-30 06:47:07.736676455 +0000 UTC m=+1671.211697535" lastFinishedPulling="2025-09-30 06:47:08.235906461 +0000 UTC m=+1671.710927521" observedRunningTime="2025-09-30 06:47:08.553981574 +0000 UTC m=+1672.029002634" watchObservedRunningTime="2025-09-30 06:47:08.565965697 +0000 UTC m=+1672.040986777" Sep 30 06:47:13 crc kubenswrapper[4691]: I0930 06:47:13.058562 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-fh78b"] Sep 30 06:47:13 crc kubenswrapper[4691]: I0930 06:47:13.071393 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-db-sync-fh78b"] Sep 30 06:47:13 crc kubenswrapper[4691]: I0930 06:47:13.237213 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="071e402d-9775-412e-ad8a-1643cd646d7c" path="/var/lib/kubelet/pods/071e402d-9775-412e-ad8a-1643cd646d7c/volumes" Sep 30 06:47:22 crc kubenswrapper[4691]: I0930 06:47:22.850299 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:47:22 crc kubenswrapper[4691]: I0930 06:47:22.850772 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:47:22 crc kubenswrapper[4691]: I0930 06:47:22.850821 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:47:22 crc kubenswrapper[4691]: I0930 06:47:22.851762 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:47:22 crc kubenswrapper[4691]: I0930 06:47:22.851832 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" gracePeriod=600 Sep 30 06:47:22 crc kubenswrapper[4691]: E0930 06:47:22.984723 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:47:23 crc kubenswrapper[4691]: I0930 06:47:23.693985 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" exitCode=0 Sep 30 06:47:23 crc kubenswrapper[4691]: I0930 06:47:23.694048 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a"} Sep 30 06:47:23 crc kubenswrapper[4691]: I0930 06:47:23.694268 4691 scope.go:117] "RemoveContainer" containerID="cdbae690f51f4bea9a63d0f6c926710bf8cae323365923c61958365eb48e16db" Sep 30 06:47:23 crc kubenswrapper[4691]: I0930 06:47:23.695520 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:47:23 crc kubenswrapper[4691]: 
E0930 06:47:23.696320 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:47:24 crc kubenswrapper[4691]: I0930 06:47:24.556241 4691 scope.go:117] "RemoveContainer" containerID="c421916911fd50b60418d185f1c860b702b4ec4ec3250ddf3ec26d1b20cbdc09" Sep 30 06:47:24 crc kubenswrapper[4691]: I0930 06:47:24.598771 4691 scope.go:117] "RemoveContainer" containerID="102e58ebdd5146960fce63254dfc2ca4b3e0d16f64058217cd413713ce9faf5e" Sep 30 06:47:24 crc kubenswrapper[4691]: I0930 06:47:24.676950 4691 scope.go:117] "RemoveContainer" containerID="f53ce7512c565ebd65d02bacc532dddd113915699808a3404774f62e491fb73d" Sep 30 06:47:24 crc kubenswrapper[4691]: I0930 06:47:24.740143 4691 scope.go:117] "RemoveContainer" containerID="8b4784efaa23c98f9e640e2c991c0a4de71215216dde1b5054d199bea4aaf3fd" Sep 30 06:47:24 crc kubenswrapper[4691]: I0930 06:47:24.799011 4691 scope.go:117] "RemoveContainer" containerID="10cd68f7901e1f9bda7ddcbd82ff985ba27907d359a88feddc49294014fb128d" Sep 30 06:47:35 crc kubenswrapper[4691]: I0930 06:47:35.225294 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:47:35 crc kubenswrapper[4691]: E0930 06:47:35.226165 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:47:39 crc kubenswrapper[4691]: I0930 06:47:39.065903 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-f7vsc"] Sep 30 06:47:39 crc kubenswrapper[4691]: I0930 06:47:39.079189 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-zs2kc"] Sep 30 06:47:39 crc kubenswrapper[4691]: I0930 06:47:39.090670 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-9j44w"] Sep 30 06:47:39 crc kubenswrapper[4691]: I0930 06:47:39.102249 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-zs2kc"] Sep 30 06:47:39 crc kubenswrapper[4691]: I0930 06:47:39.112845 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-9j44w"] Sep 30 06:47:39 crc kubenswrapper[4691]: I0930 06:47:39.121337 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-f7vsc"] Sep 30 06:47:39 crc kubenswrapper[4691]: I0930 06:47:39.235146 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="149dd243-4659-4434-a6d1-63fb57617546" path="/var/lib/kubelet/pods/149dd243-4659-4434-a6d1-63fb57617546/volumes" Sep 30 06:47:39 crc kubenswrapper[4691]: I0930 06:47:39.235751 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63e509fa-065b-49fa-8e8b-292350d86b8f" path="/var/lib/kubelet/pods/63e509fa-065b-49fa-8e8b-292350d86b8f/volumes" Sep 30 06:47:39 crc kubenswrapper[4691]: I0930 06:47:39.236382 4691 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a199530-065b-4a27-bd83-b0b8f0ae2c13" path="/var/lib/kubelet/pods/9a199530-065b-4a27-bd83-b0b8f0ae2c13/volumes" Sep 30 06:47:46 crc kubenswrapper[4691]: I0930 06:47:46.225357 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:47:46 crc kubenswrapper[4691]: E0930 06:47:46.227673 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:47:49 crc kubenswrapper[4691]: I0930 06:47:49.041639 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7d09-account-create-9q4qn"] Sep 30 06:47:49 crc kubenswrapper[4691]: I0930 06:47:49.053560 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-2f4d-account-create-lb6cd"] Sep 30 06:47:49 crc kubenswrapper[4691]: I0930 06:47:49.074839 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7d09-account-create-9q4qn"] Sep 30 06:47:49 crc kubenswrapper[4691]: I0930 06:47:49.086817 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-2f4d-account-create-lb6cd"] Sep 30 06:47:49 crc kubenswrapper[4691]: I0930 06:47:49.238473 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b656027-0cbe-4ac6-bc45-b6b91d5246c9" path="/var/lib/kubelet/pods/1b656027-0cbe-4ac6-bc45-b6b91d5246c9/volumes" Sep 30 06:47:49 crc kubenswrapper[4691]: I0930 06:47:49.239510 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ee2df6e-d485-4863-ad00-4a6c50d7f726" path="/var/lib/kubelet/pods/5ee2df6e-d485-4863-ad00-4a6c50d7f726/volumes" Sep 30 06:47:50 crc kubenswrapper[4691]: I0930 06:47:50.044694 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-7d40-account-create-2b8c5"] Sep 30 06:47:50 crc kubenswrapper[4691]: I0930 06:47:50.063934 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-7d40-account-create-2b8c5"] Sep 30 06:47:51 crc kubenswrapper[4691]: I0930 06:47:51.239797 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14ced404-77df-4f44-aece-f7e3d8add6d2" path="/var/lib/kubelet/pods/14ced404-77df-4f44-aece-f7e3d8add6d2/volumes" Sep 30 06:48:00 crc kubenswrapper[4691]: I0930 06:48:00.225438 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:48:00 crc kubenswrapper[4691]: E0930 06:48:00.226269 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:48:14 crc kubenswrapper[4691]: I0930 06:48:14.225673 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:48:14 crc kubenswrapper[4691]: E0930 06:48:14.226790 4691 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:48:15 crc kubenswrapper[4691]: I0930 06:48:15.051002 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7vkj"] Sep 30 06:48:15 crc kubenswrapper[4691]: I0930 06:48:15.058720 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-k7vkj"] Sep 30 06:48:15 crc kubenswrapper[4691]: I0930 06:48:15.246263 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a" path="/var/lib/kubelet/pods/c1d80b8d-cdaa-4dec-9ef5-dca9c837ed7a/volumes" Sep 30 06:48:24 crc kubenswrapper[4691]: I0930 06:48:24.439839 4691 generic.go:334] "Generic (PLEG): container finished" podID="6b78a233-7f96-48a0-b484-0bb1196d8d4e" containerID="527289b6941044633c29b52cd1811c696a676ab8b35d77cbbb537bc59e31cb73" exitCode=0 Sep 30 06:48:24 crc kubenswrapper[4691]: I0930 06:48:24.439918 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" event={"ID":"6b78a233-7f96-48a0-b484-0bb1196d8d4e","Type":"ContainerDied","Data":"527289b6941044633c29b52cd1811c696a676ab8b35d77cbbb537bc59e31cb73"} Sep 30 06:48:24 crc kubenswrapper[4691]: I0930 06:48:24.948574 4691 scope.go:117] "RemoveContainer" containerID="a50e72b3684249c6ef44c44bf367e3bc96a00911cf3cfc02d0c6b7bf6ce6bc14" Sep 30 06:48:25 crc kubenswrapper[4691]: I0930 06:48:25.025655 4691 scope.go:117] "RemoveContainer" containerID="43088109a9909db3c1bd729fe02db38943455e3bb1680ececbbb8ca2edc63c80" Sep 30 06:48:25 crc kubenswrapper[4691]: I0930 06:48:25.072680 4691 scope.go:117] "RemoveContainer" containerID="b41bf93c569126220b66ae79b81c8d5f338fe1362cfdb688b6c6a84396578d36" Sep 30 06:48:25 crc kubenswrapper[4691]: I0930 06:48:25.138684 4691 scope.go:117] "RemoveContainer" containerID="10cdfd114fab731994de2eee98c74403b9f52834b6af5b4fe1d54c3a849b5851" Sep 30 06:48:25 crc kubenswrapper[4691]: I0930 06:48:25.185080 4691 scope.go:117] "RemoveContainer" containerID="f02be3efb729e7e0e0f978ba4fa4f1b56019a452276a1452c54353dfaf7957f6" Sep 30 06:48:25 crc kubenswrapper[4691]: I0930 06:48:25.222382 4691 scope.go:117] "RemoveContainer" containerID="4fac8c1261ee213426f764c37fe193efedcaa5ff9e16a5655a1a9aa7ecdb85d3" Sep 30 06:48:25 crc kubenswrapper[4691]: I0930 06:48:25.268515 4691 scope.go:117] "RemoveContainer" containerID="44dc43004671078ef18a6070b1f939de4f8bbefd20a88c929fb127889b048565" Sep 30 06:48:25 crc kubenswrapper[4691]: I0930 06:48:25.877929 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.003973 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q4xsc\" (UniqueName: \"kubernetes.io/projected/6b78a233-7f96-48a0-b484-0bb1196d8d4e-kube-api-access-q4xsc\") pod \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.004071 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-inventory\") pod \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.004258 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-ssh-key\") pod \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\" (UID: \"6b78a233-7f96-48a0-b484-0bb1196d8d4e\") " Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.010346 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b78a233-7f96-48a0-b484-0bb1196d8d4e-kube-api-access-q4xsc" (OuterVolumeSpecName: "kube-api-access-q4xsc") pod "6b78a233-7f96-48a0-b484-0bb1196d8d4e" (UID: "6b78a233-7f96-48a0-b484-0bb1196d8d4e"). InnerVolumeSpecName "kube-api-access-q4xsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.036407 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-inventory" (OuterVolumeSpecName: "inventory") pod "6b78a233-7f96-48a0-b484-0bb1196d8d4e" (UID: "6b78a233-7f96-48a0-b484-0bb1196d8d4e"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.054803 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b78a233-7f96-48a0-b484-0bb1196d8d4e" (UID: "6b78a233-7f96-48a0-b484-0bb1196d8d4e"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.107181 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4xsc\" (UniqueName: \"kubernetes.io/projected/6b78a233-7f96-48a0-b484-0bb1196d8d4e-kube-api-access-q4xsc\") on node \"crc\" DevicePath \"\"" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.107249 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.107279 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b78a233-7f96-48a0-b484-0bb1196d8d4e-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.465200 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" event={"ID":"6b78a233-7f96-48a0-b484-0bb1196d8d4e","Type":"ContainerDied","Data":"d3b3a03c93adaceaf3209118d99c97dc323129c07ac4a4e5b12e1ee0de4f0a82"} Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.465668 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3b3a03c93adaceaf3209118d99c97dc323129c07ac4a4e5b12e1ee0de4f0a82" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.465336 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-528ll" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.603492 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz"] Sep 30 06:48:26 crc kubenswrapper[4691]: E0930 06:48:26.604076 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b78a233-7f96-48a0-b484-0bb1196d8d4e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.604097 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b78a233-7f96-48a0-b484-0bb1196d8d4e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.604341 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b78a233-7f96-48a0-b484-0bb1196d8d4e" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.605239 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.608077 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.608636 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.609820 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.614059 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.616204 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz"] Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.720251 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5xqwz\" (UniqueName: \"kubernetes.io/projected/e97ad218-7d51-462b-bdcf-cd39157152c1-kube-api-access-5xqwz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t25dz\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.720405 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t25dz\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.720526 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t25dz\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.822401 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5xqwz\" (UniqueName: \"kubernetes.io/projected/e97ad218-7d51-462b-bdcf-cd39157152c1-kube-api-access-5xqwz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t25dz\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.822541 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t25dz\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.822660 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-ssh-key\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-t25dz\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.829042 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t25dz\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.833970 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-ssh-key\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t25dz\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.845102 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5xqwz\" (UniqueName: \"kubernetes.io/projected/e97ad218-7d51-462b-bdcf-cd39157152c1-kube-api-access-5xqwz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-t25dz\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:26 crc kubenswrapper[4691]: I0930 06:48:26.937565 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:27 crc kubenswrapper[4691]: I0930 06:48:27.230631 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:48:27 crc kubenswrapper[4691]: E0930 06:48:27.231763 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:48:27 crc kubenswrapper[4691]: I0930 06:48:27.529702 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz"] Sep 30 06:48:27 crc kubenswrapper[4691]: W0930 06:48:27.536081 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode97ad218_7d51_462b_bdcf_cd39157152c1.slice/crio-fd6b4e17a746cf457d5d7d52a6b5be052a4a8c485b61753a892f027a0538ca35 WatchSource:0}: Error finding container fd6b4e17a746cf457d5d7d52a6b5be052a4a8c485b61753a892f027a0538ca35: Status 404 returned error can't find the container with id fd6b4e17a746cf457d5d7d52a6b5be052a4a8c485b61753a892f027a0538ca35 Sep 30 06:48:28 crc kubenswrapper[4691]: I0930 06:48:28.484297 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" event={"ID":"e97ad218-7d51-462b-bdcf-cd39157152c1","Type":"ContainerStarted","Data":"877e8d17a6f57ab60928652a52d55dc1b4781cd1a58147c14542f7a5d32901c4"} Sep 30 06:48:28 crc kubenswrapper[4691]: I0930 06:48:28.484661 4691 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" event={"ID":"e97ad218-7d51-462b-bdcf-cd39157152c1","Type":"ContainerStarted","Data":"fd6b4e17a746cf457d5d7d52a6b5be052a4a8c485b61753a892f027a0538ca35"} Sep 30 06:48:28 crc kubenswrapper[4691]: I0930 06:48:28.500854 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" podStartSLOduration=1.938261926 podStartE2EDuration="2.500839057s" podCreationTimestamp="2025-09-30 06:48:26 +0000 UTC" firstStartedPulling="2025-09-30 06:48:27.538754861 +0000 UTC m=+1751.013775911" lastFinishedPulling="2025-09-30 06:48:28.101331992 +0000 UTC m=+1751.576353042" observedRunningTime="2025-09-30 06:48:28.499157964 +0000 UTC m=+1751.974179034" watchObservedRunningTime="2025-09-30 06:48:28.500839057 +0000 UTC m=+1751.975860097" Sep 30 06:48:34 crc kubenswrapper[4691]: I0930 06:48:34.555133 4691 generic.go:334] "Generic (PLEG): container finished" podID="e97ad218-7d51-462b-bdcf-cd39157152c1" containerID="877e8d17a6f57ab60928652a52d55dc1b4781cd1a58147c14542f7a5d32901c4" exitCode=0 Sep 30 06:48:34 crc kubenswrapper[4691]: I0930 06:48:34.555266 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" event={"ID":"e97ad218-7d51-462b-bdcf-cd39157152c1","Type":"ContainerDied","Data":"877e8d17a6f57ab60928652a52d55dc1b4781cd1a58147c14542f7a5d32901c4"} Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.046257 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.132020 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-ssh-key\") pod \"e97ad218-7d51-462b-bdcf-cd39157152c1\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.132119 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-inventory\") pod \"e97ad218-7d51-462b-bdcf-cd39157152c1\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.132230 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5xqwz\" (UniqueName: \"kubernetes.io/projected/e97ad218-7d51-462b-bdcf-cd39157152c1-kube-api-access-5xqwz\") pod \"e97ad218-7d51-462b-bdcf-cd39157152c1\" (UID: \"e97ad218-7d51-462b-bdcf-cd39157152c1\") " Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.138675 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e97ad218-7d51-462b-bdcf-cd39157152c1-kube-api-access-5xqwz" (OuterVolumeSpecName: "kube-api-access-5xqwz") pod "e97ad218-7d51-462b-bdcf-cd39157152c1" (UID: "e97ad218-7d51-462b-bdcf-cd39157152c1"). InnerVolumeSpecName "kube-api-access-5xqwz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.172980 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e97ad218-7d51-462b-bdcf-cd39157152c1" (UID: "e97ad218-7d51-462b-bdcf-cd39157152c1"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.185929 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-inventory" (OuterVolumeSpecName: "inventory") pod "e97ad218-7d51-462b-bdcf-cd39157152c1" (UID: "e97ad218-7d51-462b-bdcf-cd39157152c1"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.235869 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.236215 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e97ad218-7d51-462b-bdcf-cd39157152c1-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.236229 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5xqwz\" (UniqueName: \"kubernetes.io/projected/e97ad218-7d51-462b-bdcf-cd39157152c1-kube-api-access-5xqwz\") on node \"crc\" DevicePath \"\"" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.585396 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" event={"ID":"e97ad218-7d51-462b-bdcf-cd39157152c1","Type":"ContainerDied","Data":"fd6b4e17a746cf457d5d7d52a6b5be052a4a8c485b61753a892f027a0538ca35"} Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.585440 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd6b4e17a746cf457d5d7d52a6b5be052a4a8c485b61753a892f027a0538ca35" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.585469 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-t25dz" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.706398 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt"] Sep 30 06:48:36 crc kubenswrapper[4691]: E0930 06:48:36.707016 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97ad218-7d51-462b-bdcf-cd39157152c1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.707049 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97ad218-7d51-462b-bdcf-cd39157152c1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.707337 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e97ad218-7d51-462b-bdcf-cd39157152c1" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.708284 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.711546 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.712857 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.713171 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.714185 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.725992 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt"] Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.847575 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wlzdt\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.847638 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfthk\" (UniqueName: \"kubernetes.io/projected/455e6d2b-cc2e-4b09-899d-f913094c603f-kube-api-access-lfthk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wlzdt\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.847713 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wlzdt\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.952362 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wlzdt\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.952456 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfthk\" (UniqueName: \"kubernetes.io/projected/455e6d2b-cc2e-4b09-899d-f913094c603f-kube-api-access-lfthk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wlzdt\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.952582 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wlzdt\" (UID: 
\"455e6d2b-cc2e-4b09-899d-f913094c603f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.957565 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-ssh-key\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wlzdt\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.964411 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wlzdt\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:36 crc kubenswrapper[4691]: I0930 06:48:36.981328 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfthk\" (UniqueName: \"kubernetes.io/projected/455e6d2b-cc2e-4b09-899d-f913094c603f-kube-api-access-lfthk\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-wlzdt\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:37 crc kubenswrapper[4691]: I0930 06:48:37.032803 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:48:37 crc kubenswrapper[4691]: I0930 06:48:37.051270 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-fvqrc"] Sep 30 06:48:37 crc kubenswrapper[4691]: I0930 06:48:37.061354 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-fvqrc"] Sep 30 06:48:37 crc kubenswrapper[4691]: I0930 06:48:37.237254 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c832fe6-a00d-4666-9f42-1adaae1d9007" path="/var/lib/kubelet/pods/5c832fe6-a00d-4666-9f42-1adaae1d9007/volumes" Sep 30 06:48:37 crc kubenswrapper[4691]: I0930 06:48:37.618280 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt"] Sep 30 06:48:38 crc kubenswrapper[4691]: I0930 06:48:38.616446 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" event={"ID":"455e6d2b-cc2e-4b09-899d-f913094c603f","Type":"ContainerStarted","Data":"5a21a87331f79d61cbf690b0223eb5ca4120d4ada7a284aeb61dca3c11e2b565"} Sep 30 06:48:38 crc kubenswrapper[4691]: I0930 06:48:38.617059 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" event={"ID":"455e6d2b-cc2e-4b09-899d-f913094c603f","Type":"ContainerStarted","Data":"0857a6a3d720a02bc70697559cd668c0b9191ef086805ddb270c38471d2813d4"} Sep 30 06:48:38 crc kubenswrapper[4691]: I0930 06:48:38.644709 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" podStartSLOduration=2.263280522 podStartE2EDuration="2.644688449s" podCreationTimestamp="2025-09-30 06:48:36 +0000 UTC" firstStartedPulling="2025-09-30 06:48:37.620448858 +0000 UTC m=+1761.095469938" lastFinishedPulling="2025-09-30 06:48:38.001856795 +0000 UTC m=+1761.476877865" observedRunningTime="2025-09-30 
06:48:38.641335212 +0000 UTC m=+1762.116356252" watchObservedRunningTime="2025-09-30 06:48:38.644688449 +0000 UTC m=+1762.119709499" Sep 30 06:48:39 crc kubenswrapper[4691]: I0930 06:48:39.049771 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dtmdb"] Sep 30 06:48:39 crc kubenswrapper[4691]: I0930 06:48:39.065349 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-dtmdb"] Sep 30 06:48:39 crc kubenswrapper[4691]: I0930 06:48:39.240083 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619" path="/var/lib/kubelet/pods/0c0bdcf6-bc45-43e0-9d6c-e3a4be10a619/volumes" Sep 30 06:48:40 crc kubenswrapper[4691]: I0930 06:48:40.224498 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:48:40 crc kubenswrapper[4691]: E0930 06:48:40.225302 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:48:51 crc kubenswrapper[4691]: I0930 06:48:51.225581 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:48:51 crc kubenswrapper[4691]: E0930 06:48:51.227020 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:49:04 crc kubenswrapper[4691]: I0930 06:49:04.224714 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:49:04 crc kubenswrapper[4691]: E0930 06:49:04.225613 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:49:16 crc kubenswrapper[4691]: I0930 06:49:16.225126 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:49:16 crc kubenswrapper[4691]: E0930 06:49:16.226209 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:49:22 crc kubenswrapper[4691]: I0930 06:49:22.110934 4691 generic.go:334] "Generic (PLEG): container finished" 
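
The pod_startup_latency_tracker entry above reports several timestamps at once, and the two durations follow from them: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp (06:48:38.644688449 - 06:48:36 = 2.644688449s), and podStartSLOduration appears to be that figure minus the image-pull window, lastFinishedPulling - firstStartedPulling (m=+1761.476877865 - m=+1761.095469938 = 0.381407927s, giving 2.263280522s, which matches the logged value). A minimal Go sketch of the arithmetic, with the wall-clock values copied from the entry; this is a reader's check, not kubelet code, and the wall-clock strings differ from the monotonic m=+ readings by a few nanoseconds:

    package main

    import (
        "fmt"
        "time"
    )

    const layout = "2006-01-02 15:04:05.999999999 -0700 MST"

    func mustParse(s string) time.Time {
        t, err := time.Parse(layout, s)
        if err != nil {
            panic(err)
        }
        return t
    }

    func main() {
        // Timestamps copied from the install-os pod_startup_latency_tracker entry.
        created := mustParse("2025-09-30 06:48:36 +0000 UTC")
        running := mustParse("2025-09-30 06:48:38.644688449 +0000 UTC") // watchObservedRunningTime
        pullStart := mustParse("2025-09-30 06:48:37.620448858 +0000 UTC")
        pullEnd := mustParse("2025-09-30 06:48:38.001856795 +0000 UTC")

        e2e := running.Sub(created)         // podStartE2EDuration: 2.644688449s
        slo := e2e - pullEnd.Sub(pullStart) // image-pull time excluded: ~2.26328s
        fmt.Println(e2e, slo)
    }
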
podID="455e6d2b-cc2e-4b09-899d-f913094c603f" containerID="5a21a87331f79d61cbf690b0223eb5ca4120d4ada7a284aeb61dca3c11e2b565" exitCode=0 Sep 30 06:49:22 crc kubenswrapper[4691]: I0930 06:49:22.111000 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" event={"ID":"455e6d2b-cc2e-4b09-899d-f913094c603f","Type":"ContainerDied","Data":"5a21a87331f79d61cbf690b0223eb5ca4120d4ada7a284aeb61dca3c11e2b565"} Sep 30 06:49:23 crc kubenswrapper[4691]: I0930 06:49:23.570583 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:49:23 crc kubenswrapper[4691]: I0930 06:49:23.714449 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-ssh-key\") pod \"455e6d2b-cc2e-4b09-899d-f913094c603f\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " Sep 30 06:49:23 crc kubenswrapper[4691]: I0930 06:49:23.714520 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-inventory\") pod \"455e6d2b-cc2e-4b09-899d-f913094c603f\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " Sep 30 06:49:23 crc kubenswrapper[4691]: I0930 06:49:23.714559 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfthk\" (UniqueName: \"kubernetes.io/projected/455e6d2b-cc2e-4b09-899d-f913094c603f-kube-api-access-lfthk\") pod \"455e6d2b-cc2e-4b09-899d-f913094c603f\" (UID: \"455e6d2b-cc2e-4b09-899d-f913094c603f\") " Sep 30 06:49:23 crc kubenswrapper[4691]: I0930 06:49:23.728908 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455e6d2b-cc2e-4b09-899d-f913094c603f-kube-api-access-lfthk" (OuterVolumeSpecName: "kube-api-access-lfthk") pod "455e6d2b-cc2e-4b09-899d-f913094c603f" (UID: "455e6d2b-cc2e-4b09-899d-f913094c603f"). InnerVolumeSpecName "kube-api-access-lfthk". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:49:23 crc kubenswrapper[4691]: I0930 06:49:23.752860 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "455e6d2b-cc2e-4b09-899d-f913094c603f" (UID: "455e6d2b-cc2e-4b09-899d-f913094c603f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:49:23 crc kubenswrapper[4691]: I0930 06:49:23.778722 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-inventory" (OuterVolumeSpecName: "inventory") pod "455e6d2b-cc2e-4b09-899d-f913094c603f" (UID: "455e6d2b-cc2e-4b09-899d-f913094c603f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:49:23 crc kubenswrapper[4691]: I0930 06:49:23.824143 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:49:23 crc kubenswrapper[4691]: I0930 06:49:23.824181 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/455e6d2b-cc2e-4b09-899d-f913094c603f-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:49:23 crc kubenswrapper[4691]: I0930 06:49:23.824193 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfthk\" (UniqueName: \"kubernetes.io/projected/455e6d2b-cc2e-4b09-899d-f913094c603f-kube-api-access-lfthk\") on node \"crc\" DevicePath \"\"" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.046136 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-xqb84"] Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.055322 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-xqb84"] Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.145011 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" event={"ID":"455e6d2b-cc2e-4b09-899d-f913094c603f","Type":"ContainerDied","Data":"0857a6a3d720a02bc70697559cd668c0b9191ef086805ddb270c38471d2813d4"} Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.145278 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0857a6a3d720a02bc70697559cd668c0b9191ef086805ddb270c38471d2813d4" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.145122 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-wlzdt" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.251510 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp"] Sep 30 06:49:24 crc kubenswrapper[4691]: E0930 06:49:24.252227 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455e6d2b-cc2e-4b09-899d-f913094c603f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.252262 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="455e6d2b-cc2e-4b09-899d-f913094c603f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.252664 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="455e6d2b-cc2e-4b09-899d-f913094c603f" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.253980 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.259517 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.259791 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.260921 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.260965 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.261806 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp"] Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.433797 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.433875 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp4k6\" (UniqueName: \"kubernetes.io/projected/6bb5c646-a0b7-4ed5-b5ef-28727886b271-kube-api-access-zp4k6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.433928 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.535767 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp4k6\" (UniqueName: \"kubernetes.io/projected/6bb5c646-a0b7-4ed5-b5ef-28727886b271-kube-api-access-zp4k6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.535888 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.536151 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp\" 
(UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.547744 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-ssh-key\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.549677 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.572926 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp4k6\" (UniqueName: \"kubernetes.io/projected/6bb5c646-a0b7-4ed5-b5ef-28727886b271-kube-api-access-zp4k6\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:24 crc kubenswrapper[4691]: I0930 06:49:24.871649 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:49:25 crc kubenswrapper[4691]: I0930 06:49:25.241720 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08f0f027-7a83-431f-849a-639df1298562" path="/var/lib/kubelet/pods/08f0f027-7a83-431f-849a-639df1298562/volumes" Sep 30 06:49:25 crc kubenswrapper[4691]: I0930 06:49:25.399361 4691 scope.go:117] "RemoveContainer" containerID="7cbeb0e8a236f64eee17cfcfd4c7a9bdbe2a324c7d204ef5b6b43b8dea6677cb" Sep 30 06:49:25 crc kubenswrapper[4691]: I0930 06:49:25.410816 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp"] Sep 30 06:49:25 crc kubenswrapper[4691]: I0930 06:49:25.469161 4691 scope.go:117] "RemoveContainer" containerID="c2574f2c349e8d4c9d576972fe500c1613b4f900eb12a5a5c9b576d65fbfed51" Sep 30 06:49:25 crc kubenswrapper[4691]: I0930 06:49:25.469769 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:49:25 crc kubenswrapper[4691]: I0930 06:49:25.530013 4691 scope.go:117] "RemoveContainer" containerID="c4bae847fff95c49052aac08cac27819361a386f663fe0425075f23a19795649" Sep 30 06:49:26 crc kubenswrapper[4691]: I0930 06:49:26.166624 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" event={"ID":"6bb5c646-a0b7-4ed5-b5ef-28727886b271","Type":"ContainerStarted","Data":"e73c5a8d5a0e0efd5e8cda6a30c33b60744845c65764a344ef26d8e3d79cc5fe"} Sep 30 06:49:27 crc kubenswrapper[4691]: I0930 06:49:27.199487 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" event={"ID":"6bb5c646-a0b7-4ed5-b5ef-28727886b271","Type":"ContainerStarted","Data":"735f9a616a626fb14e60d68e92e35b99a9b81558e69da525ee5636d53bf8e94b"} Sep 30 06:49:27 crc kubenswrapper[4691]: I0930 06:49:27.224787 4691 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" podStartSLOduration=2.584955334 podStartE2EDuration="3.224764931s" podCreationTimestamp="2025-09-30 06:49:24 +0000 UTC" firstStartedPulling="2025-09-30 06:49:25.46945982 +0000 UTC m=+1808.944480860" lastFinishedPulling="2025-09-30 06:49:26.109269407 +0000 UTC m=+1809.584290457" observedRunningTime="2025-09-30 06:49:27.224441551 +0000 UTC m=+1810.699462641" watchObservedRunningTime="2025-09-30 06:49:27.224764931 +0000 UTC m=+1810.699785981" Sep 30 06:49:29 crc kubenswrapper[4691]: I0930 06:49:29.225596 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:49:29 crc kubenswrapper[4691]: E0930 06:49:29.226363 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:49:42 crc kubenswrapper[4691]: I0930 06:49:42.225440 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:49:42 crc kubenswrapper[4691]: E0930 06:49:42.226175 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:49:56 crc kubenswrapper[4691]: I0930 06:49:56.225123 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:49:56 crc kubenswrapper[4691]: E0930 06:49:56.225826 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:50:07 crc kubenswrapper[4691]: I0930 06:50:07.231008 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:50:07 crc kubenswrapper[4691]: E0930 06:50:07.231862 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:50:20 crc kubenswrapper[4691]: I0930 06:50:20.225837 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:50:20 crc kubenswrapper[4691]: E0930 06:50:20.227240 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:50:28 crc kubenswrapper[4691]: I0930 06:50:28.913432 4691 generic.go:334] "Generic (PLEG): container finished" podID="6bb5c646-a0b7-4ed5-b5ef-28727886b271" containerID="735f9a616a626fb14e60d68e92e35b99a9b81558e69da525ee5636d53bf8e94b" exitCode=0 Sep 30 06:50:28 crc kubenswrapper[4691]: I0930 06:50:28.913557 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" event={"ID":"6bb5c646-a0b7-4ed5-b5ef-28727886b271","Type":"ContainerDied","Data":"735f9a616a626fb14e60d68e92e35b99a9b81558e69da525ee5636d53bf8e94b"} Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.430207 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.482606 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-inventory\") pod \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.482711 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-ssh-key\") pod \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.482771 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp4k6\" (UniqueName: \"kubernetes.io/projected/6bb5c646-a0b7-4ed5-b5ef-28727886b271-kube-api-access-zp4k6\") pod \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\" (UID: \"6bb5c646-a0b7-4ed5-b5ef-28727886b271\") " Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.491931 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb5c646-a0b7-4ed5-b5ef-28727886b271-kube-api-access-zp4k6" (OuterVolumeSpecName: "kube-api-access-zp4k6") pod "6bb5c646-a0b7-4ed5-b5ef-28727886b271" (UID: "6bb5c646-a0b7-4ed5-b5ef-28727886b271"). InnerVolumeSpecName "kube-api-access-zp4k6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.534830 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6bb5c646-a0b7-4ed5-b5ef-28727886b271" (UID: "6bb5c646-a0b7-4ed5-b5ef-28727886b271"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.535553 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-inventory" (OuterVolumeSpecName: "inventory") pod "6bb5c646-a0b7-4ed5-b5ef-28727886b271" (UID: "6bb5c646-a0b7-4ed5-b5ef-28727886b271"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.585052 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.585088 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp4k6\" (UniqueName: \"kubernetes.io/projected/6bb5c646-a0b7-4ed5-b5ef-28727886b271-kube-api-access-zp4k6\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.585104 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6bb5c646-a0b7-4ed5-b5ef-28727886b271-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.937144 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" event={"ID":"6bb5c646-a0b7-4ed5-b5ef-28727886b271","Type":"ContainerDied","Data":"e73c5a8d5a0e0efd5e8cda6a30c33b60744845c65764a344ef26d8e3d79cc5fe"} Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.937192 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e73c5a8d5a0e0efd5e8cda6a30c33b60744845c65764a344ef26d8e3d79cc5fe" Sep 30 06:50:30 crc kubenswrapper[4691]: I0930 06:50:30.937188 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.052095 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jsfrl"] Sep 30 06:50:31 crc kubenswrapper[4691]: E0930 06:50:31.052878 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb5c646-a0b7-4ed5-b5ef-28727886b271" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.052915 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb5c646-a0b7-4ed5-b5ef-28727886b271" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.053163 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb5c646-a0b7-4ed5-b5ef-28727886b271" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.054039 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.056927 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.057242 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.057309 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.058078 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.083960 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jsfrl"] Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.094574 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jsfrl\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.094724 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqc9x\" (UniqueName: \"kubernetes.io/projected/890b2a56-9627-4b04-9e09-5bd7625272cd-kube-api-access-pqc9x\") pod \"ssh-known-hosts-edpm-deployment-jsfrl\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.094753 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jsfrl\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.196653 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jsfrl\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.196866 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqc9x\" (UniqueName: \"kubernetes.io/projected/890b2a56-9627-4b04-9e09-5bd7625272cd-kube-api-access-pqc9x\") pod \"ssh-known-hosts-edpm-deployment-jsfrl\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.197190 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jsfrl\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:31 crc 
kubenswrapper[4691]: I0930 06:50:31.202956 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jsfrl\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.202976 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jsfrl\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.223199 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqc9x\" (UniqueName: \"kubernetes.io/projected/890b2a56-9627-4b04-9e09-5bd7625272cd-kube-api-access-pqc9x\") pod \"ssh-known-hosts-edpm-deployment-jsfrl\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.225303 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:50:31 crc kubenswrapper[4691]: E0930 06:50:31.225665 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:50:31 crc kubenswrapper[4691]: I0930 06:50:31.397236 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:32 crc kubenswrapper[4691]: I0930 06:50:32.026966 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jsfrl"] Sep 30 06:50:32 crc kubenswrapper[4691]: I0930 06:50:32.961039 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" event={"ID":"890b2a56-9627-4b04-9e09-5bd7625272cd","Type":"ContainerStarted","Data":"11e99e3604c4b2699366028e2695e10c45ac917a5cb3b60be485f47bee380a73"} Sep 30 06:50:32 crc kubenswrapper[4691]: I0930 06:50:32.961376 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" event={"ID":"890b2a56-9627-4b04-9e09-5bd7625272cd","Type":"ContainerStarted","Data":"c9ab7e00c2127e3f54348ab9f5c8f9cdc574552d40f347666dc99fc18446661b"} Sep 30 06:50:32 crc kubenswrapper[4691]: I0930 06:50:32.995926 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" podStartSLOduration=1.409317623 podStartE2EDuration="1.995837449s" podCreationTimestamp="2025-09-30 06:50:31 +0000 UTC" firstStartedPulling="2025-09-30 06:50:32.028917178 +0000 UTC m=+1875.503938238" lastFinishedPulling="2025-09-30 06:50:32.615437004 +0000 UTC m=+1876.090458064" observedRunningTime="2025-09-30 06:50:32.982670689 +0000 UTC m=+1876.457691759" watchObservedRunningTime="2025-09-30 06:50:32.995837449 +0000 UTC m=+1876.470858519" Sep 30 06:50:41 crc kubenswrapper[4691]: I0930 06:50:41.058925 4691 generic.go:334] "Generic (PLEG): container finished" podID="890b2a56-9627-4b04-9e09-5bd7625272cd" containerID="11e99e3604c4b2699366028e2695e10c45ac917a5cb3b60be485f47bee380a73" exitCode=0 Sep 30 06:50:41 crc kubenswrapper[4691]: I0930 06:50:41.058999 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" event={"ID":"890b2a56-9627-4b04-9e09-5bd7625272cd","Type":"ContainerDied","Data":"11e99e3604c4b2699366028e2695e10c45ac917a5cb3b60be485f47bee380a73"} Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.225611 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:50:42 crc kubenswrapper[4691]: E0930 06:50:42.226115 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.619249 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.675735 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-ssh-key-openstack-edpm-ipam\") pod \"890b2a56-9627-4b04-9e09-5bd7625272cd\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.725069 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "890b2a56-9627-4b04-9e09-5bd7625272cd" (UID: "890b2a56-9627-4b04-9e09-5bd7625272cd"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.780578 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-inventory-0\") pod \"890b2a56-9627-4b04-9e09-5bd7625272cd\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.780720 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqc9x\" (UniqueName: \"kubernetes.io/projected/890b2a56-9627-4b04-9e09-5bd7625272cd-kube-api-access-pqc9x\") pod \"890b2a56-9627-4b04-9e09-5bd7625272cd\" (UID: \"890b2a56-9627-4b04-9e09-5bd7625272cd\") " Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.782380 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.796144 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/890b2a56-9627-4b04-9e09-5bd7625272cd-kube-api-access-pqc9x" (OuterVolumeSpecName: "kube-api-access-pqc9x") pod "890b2a56-9627-4b04-9e09-5bd7625272cd" (UID: "890b2a56-9627-4b04-9e09-5bd7625272cd"). InnerVolumeSpecName "kube-api-access-pqc9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.847048 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "890b2a56-9627-4b04-9e09-5bd7625272cd" (UID: "890b2a56-9627-4b04-9e09-5bd7625272cd"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.883973 4691 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/890b2a56-9627-4b04-9e09-5bd7625272cd-inventory-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:42 crc kubenswrapper[4691]: I0930 06:50:42.884019 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqc9x\" (UniqueName: \"kubernetes.io/projected/890b2a56-9627-4b04-9e09-5bd7625272cd-kube-api-access-pqc9x\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.080652 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" event={"ID":"890b2a56-9627-4b04-9e09-5bd7625272cd","Type":"ContainerDied","Data":"c9ab7e00c2127e3f54348ab9f5c8f9cdc574552d40f347666dc99fc18446661b"} Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.081516 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c9ab7e00c2127e3f54348ab9f5c8f9cdc574552d40f347666dc99fc18446661b" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.080695 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jsfrl" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.178176 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582"] Sep 30 06:50:43 crc kubenswrapper[4691]: E0930 06:50:43.178715 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="890b2a56-9627-4b04-9e09-5bd7625272cd" containerName="ssh-known-hosts-edpm-deployment" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.178754 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="890b2a56-9627-4b04-9e09-5bd7625272cd" containerName="ssh-known-hosts-edpm-deployment" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.179083 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="890b2a56-9627-4b04-9e09-5bd7625272cd" containerName="ssh-known-hosts-edpm-deployment" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.179947 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.183840 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.183868 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.184404 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.185879 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.190285 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lt582\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.192611 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfb56\" (UniqueName: \"kubernetes.io/projected/30e86152-90a5-42db-a157-e86cede48629-kube-api-access-lfb56\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lt582\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.192740 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lt582\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.194416 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582"] Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.295048 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lt582\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.295210 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfb56\" (UniqueName: \"kubernetes.io/projected/30e86152-90a5-42db-a157-e86cede48629-kube-api-access-lfb56\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lt582\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.295263 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lt582\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.299282 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-ssh-key\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lt582\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.302045 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lt582\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.323656 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfb56\" (UniqueName: \"kubernetes.io/projected/30e86152-90a5-42db-a157-e86cede48629-kube-api-access-lfb56\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-lt582\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:43 crc kubenswrapper[4691]: I0930 06:50:43.510314 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:44 crc kubenswrapper[4691]: I0930 06:50:44.052088 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582"] Sep 30 06:50:44 crc kubenswrapper[4691]: I0930 06:50:44.090420 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" event={"ID":"30e86152-90a5-42db-a157-e86cede48629","Type":"ContainerStarted","Data":"069443a71125747d861134b7108e3f6c3a60a69cbf1a69b6b69bcb3215734103"} Sep 30 06:50:45 crc kubenswrapper[4691]: I0930 06:50:45.102964 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" event={"ID":"30e86152-90a5-42db-a157-e86cede48629","Type":"ContainerStarted","Data":"27989ab91a2fadafd426e109accd5b6f23d45a7c3701365de5d734124c17e5c6"} Sep 30 06:50:45 crc kubenswrapper[4691]: I0930 06:50:45.140208 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" podStartSLOduration=1.743359382 podStartE2EDuration="2.140180131s" podCreationTimestamp="2025-09-30 06:50:43 +0000 UTC" firstStartedPulling="2025-09-30 06:50:44.06177136 +0000 UTC m=+1887.536792400" lastFinishedPulling="2025-09-30 06:50:44.458592109 +0000 UTC m=+1887.933613149" observedRunningTime="2025-09-30 06:50:45.134552481 +0000 UTC m=+1888.609573521" watchObservedRunningTime="2025-09-30 06:50:45.140180131 +0000 UTC m=+1888.615201211" Sep 30 06:50:54 crc kubenswrapper[4691]: I0930 06:50:54.199264 4691 generic.go:334] "Generic (PLEG): container finished" podID="30e86152-90a5-42db-a157-e86cede48629" containerID="27989ab91a2fadafd426e109accd5b6f23d45a7c3701365de5d734124c17e5c6" exitCode=0 Sep 30 06:50:54 crc kubenswrapper[4691]: I0930 06:50:54.199339 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" 
event={"ID":"30e86152-90a5-42db-a157-e86cede48629","Type":"ContainerDied","Data":"27989ab91a2fadafd426e109accd5b6f23d45a7c3701365de5d734124c17e5c6"} Sep 30 06:50:55 crc kubenswrapper[4691]: I0930 06:50:55.719292 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:55 crc kubenswrapper[4691]: I0930 06:50:55.888576 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-ssh-key\") pod \"30e86152-90a5-42db-a157-e86cede48629\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " Sep 30 06:50:55 crc kubenswrapper[4691]: I0930 06:50:55.889091 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-inventory\") pod \"30e86152-90a5-42db-a157-e86cede48629\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " Sep 30 06:50:55 crc kubenswrapper[4691]: I0930 06:50:55.889177 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfb56\" (UniqueName: \"kubernetes.io/projected/30e86152-90a5-42db-a157-e86cede48629-kube-api-access-lfb56\") pod \"30e86152-90a5-42db-a157-e86cede48629\" (UID: \"30e86152-90a5-42db-a157-e86cede48629\") " Sep 30 06:50:55 crc kubenswrapper[4691]: I0930 06:50:55.894234 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e86152-90a5-42db-a157-e86cede48629-kube-api-access-lfb56" (OuterVolumeSpecName: "kube-api-access-lfb56") pod "30e86152-90a5-42db-a157-e86cede48629" (UID: "30e86152-90a5-42db-a157-e86cede48629"). InnerVolumeSpecName "kube-api-access-lfb56". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:50:55 crc kubenswrapper[4691]: I0930 06:50:55.918291 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "30e86152-90a5-42db-a157-e86cede48629" (UID: "30e86152-90a5-42db-a157-e86cede48629"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:50:55 crc kubenswrapper[4691]: I0930 06:50:55.934955 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-inventory" (OuterVolumeSpecName: "inventory") pod "30e86152-90a5-42db-a157-e86cede48629" (UID: "30e86152-90a5-42db-a157-e86cede48629"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:50:55 crc kubenswrapper[4691]: I0930 06:50:55.992104 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:55 crc kubenswrapper[4691]: I0930 06:50:55.992148 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/30e86152-90a5-42db-a157-e86cede48629-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:55 crc kubenswrapper[4691]: I0930 06:50:55.992168 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfb56\" (UniqueName: \"kubernetes.io/projected/30e86152-90a5-42db-a157-e86cede48629-kube-api-access-lfb56\") on node \"crc\" DevicePath \"\"" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.225995 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" event={"ID":"30e86152-90a5-42db-a157-e86cede48629","Type":"ContainerDied","Data":"069443a71125747d861134b7108e3f6c3a60a69cbf1a69b6b69bcb3215734103"} Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.226036 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="069443a71125747d861134b7108e3f6c3a60a69cbf1a69b6b69bcb3215734103" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.226114 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-lt582" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.315587 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg"] Sep 30 06:50:56 crc kubenswrapper[4691]: E0930 06:50:56.316015 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e86152-90a5-42db-a157-e86cede48629" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.316032 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e86152-90a5-42db-a157-e86cede48629" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.316237 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e86152-90a5-42db-a157-e86cede48629" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.316918 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.322932 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.324331 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.324394 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.324432 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.338872 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg"] Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.504122 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj6vx\" (UniqueName: \"kubernetes.io/projected/6b407d88-19cd-402f-a417-64c08a37f051-kube-api-access-qj6vx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.504227 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.504263 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.605946 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qj6vx\" (UniqueName: \"kubernetes.io/projected/6b407d88-19cd-402f-a417-64c08a37f051-kube-api-access-qj6vx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.606049 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.606084 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg\" (UID: 
\"6b407d88-19cd-402f-a417-64c08a37f051\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.611479 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.628479 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-ssh-key\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.628787 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj6vx\" (UniqueName: \"kubernetes.io/projected/6b407d88-19cd-402f-a417-64c08a37f051-kube-api-access-qj6vx\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:56 crc kubenswrapper[4691]: I0930 06:50:56.655353 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" Sep 30 06:50:57 crc kubenswrapper[4691]: I0930 06:50:57.210508 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg"] Sep 30 06:50:57 crc kubenswrapper[4691]: I0930 06:50:57.234736 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:50:57 crc kubenswrapper[4691]: E0930 06:50:57.234962 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:50:57 crc kubenswrapper[4691]: I0930 06:50:57.242484 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" event={"ID":"6b407d88-19cd-402f-a417-64c08a37f051","Type":"ContainerStarted","Data":"d9838201e226adee347fc171bd5e46907a8f5b9037a7e4019981e5aeb8340f66"} Sep 30 06:50:58 crc kubenswrapper[4691]: I0930 06:50:58.255157 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" event={"ID":"6b407d88-19cd-402f-a417-64c08a37f051","Type":"ContainerStarted","Data":"c9bc3a91f14995e995337b17f2e838b9111c79f4c78c174fb60f1dc0c3f6dabd"} Sep 30 06:50:58 crc kubenswrapper[4691]: I0930 06:50:58.286562 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" podStartSLOduration=1.8793252520000001 podStartE2EDuration="2.286538193s" podCreationTimestamp="2025-09-30 06:50:56 +0000 UTC" firstStartedPulling="2025-09-30 06:50:57.20208718 +0000 UTC m=+1900.677108230" 
Sep 30 06:50:58 crc kubenswrapper[4691]: I0930 06:50:58.286562 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" podStartSLOduration=1.8793252520000001 podStartE2EDuration="2.286538193s" podCreationTimestamp="2025-09-30 06:50:56 +0000 UTC" firstStartedPulling="2025-09-30 06:50:57.20208718 +0000 UTC m=+1900.677108230" lastFinishedPulling="2025-09-30 06:50:57.609300111 +0000 UTC m=+1901.084321171" observedRunningTime="2025-09-30 06:50:58.282714871 +0000 UTC m=+1901.757735951" watchObservedRunningTime="2025-09-30 06:50:58.286538193 +0000 UTC m=+1901.761559273"
Sep 30 06:51:08 crc kubenswrapper[4691]: I0930 06:51:08.372587 4691 generic.go:334] "Generic (PLEG): container finished" podID="6b407d88-19cd-402f-a417-64c08a37f051" containerID="c9bc3a91f14995e995337b17f2e838b9111c79f4c78c174fb60f1dc0c3f6dabd" exitCode=0
Sep 30 06:51:08 crc kubenswrapper[4691]: I0930 06:51:08.373027 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" event={"ID":"6b407d88-19cd-402f-a417-64c08a37f051","Type":"ContainerDied","Data":"c9bc3a91f14995e995337b17f2e838b9111c79f4c78c174fb60f1dc0c3f6dabd"}
Sep 30 06:51:09 crc kubenswrapper[4691]: I0930 06:51:09.928402 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg"
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.004922 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-inventory\") pod \"6b407d88-19cd-402f-a417-64c08a37f051\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") "
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.005013 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-ssh-key\") pod \"6b407d88-19cd-402f-a417-64c08a37f051\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") "
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.005165 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qj6vx\" (UniqueName: \"kubernetes.io/projected/6b407d88-19cd-402f-a417-64c08a37f051-kube-api-access-qj6vx\") pod \"6b407d88-19cd-402f-a417-64c08a37f051\" (UID: \"6b407d88-19cd-402f-a417-64c08a37f051\") "
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.011166 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b407d88-19cd-402f-a417-64c08a37f051-kube-api-access-qj6vx" (OuterVolumeSpecName: "kube-api-access-qj6vx") pod "6b407d88-19cd-402f-a417-64c08a37f051" (UID: "6b407d88-19cd-402f-a417-64c08a37f051"). InnerVolumeSpecName "kube-api-access-qj6vx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.034419 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6b407d88-19cd-402f-a417-64c08a37f051" (UID: "6b407d88-19cd-402f-a417-64c08a37f051"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.048361 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-inventory" (OuterVolumeSpecName: "inventory") pod "6b407d88-19cd-402f-a417-64c08a37f051" (UID: "6b407d88-19cd-402f-a417-64c08a37f051"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.107530 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.107560 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qj6vx\" (UniqueName: \"kubernetes.io/projected/6b407d88-19cd-402f-a417-64c08a37f051-kube-api-access-qj6vx\") on node \"crc\" DevicePath \"\""
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.107569 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6b407d88-19cd-402f-a417-64c08a37f051-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.411616 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg" event={"ID":"6b407d88-19cd-402f-a417-64c08a37f051","Type":"ContainerDied","Data":"d9838201e226adee347fc171bd5e46907a8f5b9037a7e4019981e5aeb8340f66"}
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.411955 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d9838201e226adee347fc171bd5e46907a8f5b9037a7e4019981e5aeb8340f66"
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.411763 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg"
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.532467 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4"]
Sep 30 06:51:10 crc kubenswrapper[4691]: E0930 06:51:10.533121 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b407d88-19cd-402f-a417-64c08a37f051" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.533154 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b407d88-19cd-402f-a417-64c08a37f051" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.533457 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b407d88-19cd-402f-a417-64c08a37f051" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.539484 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.541015 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.541067 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.541185 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.541231 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.541467 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.542832 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.545115 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.583641 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4"] Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.625603 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.625666 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.625711 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.625757 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-ovn-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.625793 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.625832 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.625873 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.625941 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.625972 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.626010 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.626039 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sljxf\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-kube-api-access-sljxf\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.626076 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.626100 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.626220 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730278 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730366 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730411 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730458 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730507 4691 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730539 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730569 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730599 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730619 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730641 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730682 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730705 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sljxf\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-kube-api-access-sljxf\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: 
\"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730733 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.730750 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.735086 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.735230 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.735251 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.735435 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ssh-key\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.735623 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.738067 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.738109 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.738196 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.738740 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.738796 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.740255 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.740594 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.744939 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.748513 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sljxf\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-kube-api-access-sljxf\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-crvv4\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:10 crc kubenswrapper[4691]: I0930 06:51:10.851130 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:11 crc kubenswrapper[4691]: I0930 06:51:11.468160 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4"] Sep 30 06:51:12 crc kubenswrapper[4691]: I0930 06:51:12.224704 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:51:12 crc kubenswrapper[4691]: E0930 06:51:12.225220 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:51:12 crc kubenswrapper[4691]: I0930 06:51:12.435698 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" event={"ID":"98946d0d-1b03-4bf2-bd9b-71105ac901f8","Type":"ContainerStarted","Data":"1c11572de521361ffd167af5dcc0cd4a63c025eaff0a1049dcfd3a24741f7631"} Sep 30 06:51:12 crc kubenswrapper[4691]: I0930 06:51:12.436132 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" event={"ID":"98946d0d-1b03-4bf2-bd9b-71105ac901f8","Type":"ContainerStarted","Data":"0ea9b72f2f4a1959b9f579e15547897c285e81140e7b1c4ffaa7d92de57a3a06"} Sep 30 06:51:12 crc kubenswrapper[4691]: I0930 06:51:12.458932 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" podStartSLOduration=2.008420911 podStartE2EDuration="2.458915584s" podCreationTimestamp="2025-09-30 06:51:10 +0000 UTC" firstStartedPulling="2025-09-30 06:51:11.466454347 +0000 UTC m=+1914.941475427" lastFinishedPulling="2025-09-30 06:51:11.91694902 +0000 UTC m=+1915.391970100" observedRunningTime="2025-09-30 06:51:12.455435533 +0000 UTC m=+1915.930456593" watchObservedRunningTime="2025-09-30 06:51:12.458915584 +0000 UTC m=+1915.933936624" Sep 30 06:51:23 crc kubenswrapper[4691]: I0930 06:51:23.227164 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:51:23 crc kubenswrapper[4691]: E0930 06:51:23.228380 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:51:36 crc kubenswrapper[4691]: I0930 06:51:36.226061 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:51:36 crc kubenswrapper[4691]: E0930 06:51:36.227310 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:51:50 crc kubenswrapper[4691]: I0930 06:51:50.226429 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:51:50 crc kubenswrapper[4691]: E0930 06:51:50.228070 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:51:57 crc kubenswrapper[4691]: I0930 06:51:57.977638 4691 generic.go:334] "Generic (PLEG): container finished" podID="98946d0d-1b03-4bf2-bd9b-71105ac901f8" containerID="1c11572de521361ffd167af5dcc0cd4a63c025eaff0a1049dcfd3a24741f7631" exitCode=0 Sep 30 06:51:57 crc kubenswrapper[4691]: I0930 06:51:57.977717 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" event={"ID":"98946d0d-1b03-4bf2-bd9b-71105ac901f8","Type":"ContainerDied","Data":"1c11572de521361ffd167af5dcc0cd4a63c025eaff0a1049dcfd3a24741f7631"} Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.549105 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.639606 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-bootstrap-combined-ca-bundle\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.639644 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.639669 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-neutron-metadata-combined-ca-bundle\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.639726 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sljxf\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-kube-api-access-sljxf\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.639744 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.639824 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-inventory\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.639851 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ssh-key\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.639900 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ovn-combined-ca-bundle\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.639937 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-repo-setup-combined-ca-bundle\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.639976 
4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-nova-combined-ca-bundle\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.640029 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-ovn-default-certs-0\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.640083 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.640105 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-libvirt-combined-ca-bundle\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.640468 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-telemetry-combined-ca-bundle\") pod \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\" (UID: \"98946d0d-1b03-4bf2-bd9b-71105ac901f8\") " Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.644709 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.648215 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.648215 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.648446 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.648623 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.648786 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.648840 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.653052 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.653065 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.653600 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-kube-api-access-sljxf" (OuterVolumeSpecName: "kube-api-access-sljxf") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "kube-api-access-sljxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.655092 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.660751 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.678245 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.678644 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-inventory" (OuterVolumeSpecName: "inventory") pod "98946d0d-1b03-4bf2-bd9b-71105ac901f8" (UID: "98946d0d-1b03-4bf2-bd9b-71105ac901f8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742621 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742650 4691 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742661 4691 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742670 4691 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742679 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742689 4691 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742698 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sljxf\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-kube-api-access-sljxf\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742706 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742715 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742723 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742731 4691 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742739 4691 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742748 4691 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98946d0d-1b03-4bf2-bd9b-71105ac901f8-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:51:59 crc kubenswrapper[4691]: I0930 06:51:59.742758 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/98946d0d-1b03-4bf2-bd9b-71105ac901f8-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.003922 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" event={"ID":"98946d0d-1b03-4bf2-bd9b-71105ac901f8","Type":"ContainerDied","Data":"0ea9b72f2f4a1959b9f579e15547897c285e81140e7b1c4ffaa7d92de57a3a06"} Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.003970 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ea9b72f2f4a1959b9f579e15547897c285e81140e7b1c4ffaa7d92de57a3a06" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.004014 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-crvv4" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.186659 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m"] Sep 30 06:52:00 crc kubenswrapper[4691]: E0930 06:52:00.187430 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98946d0d-1b03-4bf2-bd9b-71105ac901f8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.187456 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="98946d0d-1b03-4bf2-bd9b-71105ac901f8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.187874 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="98946d0d-1b03-4bf2-bd9b-71105ac901f8" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.189050 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.191555 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.199946 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.200221 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.200389 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.200244 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.206926 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m"] Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.354345 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8638fef8-f042-4cc1-949d-fb0c107085b5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.354434 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.354468 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj4kr\" (UniqueName: \"kubernetes.io/projected/8638fef8-f042-4cc1-949d-fb0c107085b5-kube-api-access-mj4kr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.354534 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.354557 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.456732 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.457139 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8638fef8-f042-4cc1-949d-fb0c107085b5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.457295 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.457388 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj4kr\" (UniqueName: \"kubernetes.io/projected/8638fef8-f042-4cc1-949d-fb0c107085b5-kube-api-access-mj4kr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.457522 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.458535 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8638fef8-f042-4cc1-949d-fb0c107085b5-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 
06:52:00.462684 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.463200 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ssh-key\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.464781 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.491435 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj4kr\" (UniqueName: \"kubernetes.io/projected/8638fef8-f042-4cc1-949d-fb0c107085b5-kube-api-access-mj4kr\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-bl96m\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:00 crc kubenswrapper[4691]: I0930 06:52:00.512985 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:52:01 crc kubenswrapper[4691]: W0930 06:52:01.204050 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8638fef8_f042_4cc1_949d_fb0c107085b5.slice/crio-d4b0f92c9990b806302f492b392988ba9253d356b2aa2bf98c23e20db80ba2b0 WatchSource:0}: Error finding container d4b0f92c9990b806302f492b392988ba9253d356b2aa2bf98c23e20db80ba2b0: Status 404 returned error can't find the container with id d4b0f92c9990b806302f492b392988ba9253d356b2aa2bf98c23e20db80ba2b0 Sep 30 06:52:01 crc kubenswrapper[4691]: I0930 06:52:01.209709 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m"] Sep 30 06:52:02 crc kubenswrapper[4691]: I0930 06:52:02.028994 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" event={"ID":"8638fef8-f042-4cc1-949d-fb0c107085b5","Type":"ContainerStarted","Data":"9e7805f088fb7cc987bbf2572392598d3ed6cb370c40201370f6bc8053a74a65"} Sep 30 06:52:02 crc kubenswrapper[4691]: I0930 06:52:02.029330 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" event={"ID":"8638fef8-f042-4cc1-949d-fb0c107085b5","Type":"ContainerStarted","Data":"d4b0f92c9990b806302f492b392988ba9253d356b2aa2bf98c23e20db80ba2b0"} Sep 30 06:52:02 crc kubenswrapper[4691]: I0930 06:52:02.056548 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" podStartSLOduration=1.631326005 podStartE2EDuration="2.05652976s" podCreationTimestamp="2025-09-30 06:52:00 +0000 UTC" firstStartedPulling="2025-09-30 
06:52:01.208487655 +0000 UTC m=+1964.683508705" lastFinishedPulling="2025-09-30 06:52:01.63369138 +0000 UTC m=+1965.108712460" observedRunningTime="2025-09-30 06:52:02.050639952 +0000 UTC m=+1965.525661012" watchObservedRunningTime="2025-09-30 06:52:02.05652976 +0000 UTC m=+1965.531550810" Sep 30 06:52:03 crc kubenswrapper[4691]: I0930 06:52:03.225110 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:52:03 crc kubenswrapper[4691]: E0930 06:52:03.225522 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:52:16 crc kubenswrapper[4691]: I0930 06:52:16.225138 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:52:16 crc kubenswrapper[4691]: E0930 06:52:16.228465 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:52:28 crc kubenswrapper[4691]: I0930 06:52:28.225696 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:52:29 crc kubenswrapper[4691]: I0930 06:52:29.356008 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"9fbd472d3121d18d27461001cb7fb9f01463cdd402f37b5b16a3246b7caa1a84"} Sep 30 06:53:17 crc kubenswrapper[4691]: I0930 06:53:17.927098 4691 generic.go:334] "Generic (PLEG): container finished" podID="8638fef8-f042-4cc1-949d-fb0c107085b5" containerID="9e7805f088fb7cc987bbf2572392598d3ed6cb370c40201370f6bc8053a74a65" exitCode=0 Sep 30 06:53:17 crc kubenswrapper[4691]: I0930 06:53:17.927143 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" event={"ID":"8638fef8-f042-4cc1-949d-fb0c107085b5","Type":"ContainerDied","Data":"9e7805f088fb7cc987bbf2572392598d3ed6cb370c40201370f6bc8053a74a65"} Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.412053 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.487582 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj4kr\" (UniqueName: \"kubernetes.io/projected/8638fef8-f042-4cc1-949d-fb0c107085b5-kube-api-access-mj4kr\") pod \"8638fef8-f042-4cc1-949d-fb0c107085b5\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.487645 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-inventory\") pod \"8638fef8-f042-4cc1-949d-fb0c107085b5\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.487725 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ovn-combined-ca-bundle\") pod \"8638fef8-f042-4cc1-949d-fb0c107085b5\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.487770 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8638fef8-f042-4cc1-949d-fb0c107085b5-ovncontroller-config-0\") pod \"8638fef8-f042-4cc1-949d-fb0c107085b5\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.487853 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ssh-key\") pod \"8638fef8-f042-4cc1-949d-fb0c107085b5\" (UID: \"8638fef8-f042-4cc1-949d-fb0c107085b5\") " Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.497391 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8638fef8-f042-4cc1-949d-fb0c107085b5-kube-api-access-mj4kr" (OuterVolumeSpecName: "kube-api-access-mj4kr") pod "8638fef8-f042-4cc1-949d-fb0c107085b5" (UID: "8638fef8-f042-4cc1-949d-fb0c107085b5"). InnerVolumeSpecName "kube-api-access-mj4kr". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.497652 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8638fef8-f042-4cc1-949d-fb0c107085b5" (UID: "8638fef8-f042-4cc1-949d-fb0c107085b5"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.520834 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-inventory" (OuterVolumeSpecName: "inventory") pod "8638fef8-f042-4cc1-949d-fb0c107085b5" (UID: "8638fef8-f042-4cc1-949d-fb0c107085b5"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.530613 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8638fef8-f042-4cc1-949d-fb0c107085b5-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "8638fef8-f042-4cc1-949d-fb0c107085b5" (UID: "8638fef8-f042-4cc1-949d-fb0c107085b5"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.532507 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8638fef8-f042-4cc1-949d-fb0c107085b5" (UID: "8638fef8-f042-4cc1-949d-fb0c107085b5"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.590178 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj4kr\" (UniqueName: \"kubernetes.io/projected/8638fef8-f042-4cc1-949d-fb0c107085b5-kube-api-access-mj4kr\") on node \"crc\" DevicePath \"\"" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.590230 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.590241 4691 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.590249 4691 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/8638fef8-f042-4cc1-949d-fb0c107085b5-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.590257 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8638fef8-f042-4cc1-949d-fb0c107085b5-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.945287 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" event={"ID":"8638fef8-f042-4cc1-949d-fb0c107085b5","Type":"ContainerDied","Data":"d4b0f92c9990b806302f492b392988ba9253d356b2aa2bf98c23e20db80ba2b0"} Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.945350 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4b0f92c9990b806302f492b392988ba9253d356b2aa2bf98c23e20db80ba2b0" Sep 30 06:53:19 crc kubenswrapper[4691]: I0930 06:53:19.945324 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-bl96m" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.085077 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r"] Sep 30 06:53:20 crc kubenswrapper[4691]: E0930 06:53:20.085776 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8638fef8-f042-4cc1-949d-fb0c107085b5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.085867 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8638fef8-f042-4cc1-949d-fb0c107085b5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.086218 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8638fef8-f042-4cc1-949d-fb0c107085b5" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.087151 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.096351 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.096581 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.096649 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.096709 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.096779 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.102736 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r"] Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.103705 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.204976 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.205058 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.205140 4691 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.205168 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jsf4\" (UniqueName: \"kubernetes.io/projected/214c8c4f-8184-4b59-9fcd-c1112551b5b2-kube-api-access-8jsf4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.205196 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.205418 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.307470 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.307684 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.307800 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.307825 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jsf4\" (UniqueName: \"kubernetes.io/projected/214c8c4f-8184-4b59-9fcd-c1112551b5b2-kube-api-access-8jsf4\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.307850 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.307939 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.313378 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.313721 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-ssh-key\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.315418 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.316660 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.320563 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.335903 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8jsf4\" (UniqueName: \"kubernetes.io/projected/214c8c4f-8184-4b59-9fcd-c1112551b5b2-kube-api-access-8jsf4\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:20 crc kubenswrapper[4691]: I0930 06:53:20.417472 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:53:21 crc kubenswrapper[4691]: I0930 06:53:21.092647 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r"] Sep 30 06:53:21 crc kubenswrapper[4691]: I0930 06:53:21.984913 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" event={"ID":"214c8c4f-8184-4b59-9fcd-c1112551b5b2","Type":"ContainerStarted","Data":"ce520dbaa365e49809f98449a2e82823b4fc782d9c862fa59cb0b26714db9579"} Sep 30 06:53:21 crc kubenswrapper[4691]: I0930 06:53:21.985195 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" event={"ID":"214c8c4f-8184-4b59-9fcd-c1112551b5b2","Type":"ContainerStarted","Data":"f1894a9d56dbda9bd0bda787405fdd358f9f71895405d8952bfbbf9a341a5e45"} Sep 30 06:53:22 crc kubenswrapper[4691]: I0930 06:53:22.021165 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" podStartSLOduration=1.52388237 podStartE2EDuration="2.021137915s" podCreationTimestamp="2025-09-30 06:53:20 +0000 UTC" firstStartedPulling="2025-09-30 06:53:21.098225928 +0000 UTC m=+2044.573246978" lastFinishedPulling="2025-09-30 06:53:21.595481473 +0000 UTC m=+2045.070502523" observedRunningTime="2025-09-30 06:53:22.005063789 +0000 UTC m=+2045.480084869" watchObservedRunningTime="2025-09-30 06:53:22.021137915 +0000 UTC m=+2045.496158995" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.035264 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fztdr"] Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.037981 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.047849 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fztdr"] Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.101653 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-catalog-content\") pod \"community-operators-fztdr\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.101779 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph87s\" (UniqueName: \"kubernetes.io/projected/441dfe2b-9391-4f05-9c16-b721ce87a854-kube-api-access-ph87s\") pod \"community-operators-fztdr\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.101800 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-utilities\") pod \"community-operators-fztdr\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.204089 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-catalog-content\") pod \"community-operators-fztdr\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.204172 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph87s\" (UniqueName: \"kubernetes.io/projected/441dfe2b-9391-4f05-9c16-b721ce87a854-kube-api-access-ph87s\") pod \"community-operators-fztdr\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.204193 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-utilities\") pod \"community-operators-fztdr\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.204913 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-utilities\") pod \"community-operators-fztdr\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.204969 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-catalog-content\") pod \"community-operators-fztdr\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.229026 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ph87s\" (UniqueName: \"kubernetes.io/projected/441dfe2b-9391-4f05-9c16-b721ce87a854-kube-api-access-ph87s\") pod \"community-operators-fztdr\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.376341 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:15 crc kubenswrapper[4691]: I0930 06:54:15.959315 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fztdr"] Sep 30 06:54:16 crc kubenswrapper[4691]: I0930 06:54:16.634161 4691 generic.go:334] "Generic (PLEG): container finished" podID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerID="277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c" exitCode=0 Sep 30 06:54:16 crc kubenswrapper[4691]: I0930 06:54:16.634257 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fztdr" event={"ID":"441dfe2b-9391-4f05-9c16-b721ce87a854","Type":"ContainerDied","Data":"277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c"} Sep 30 06:54:16 crc kubenswrapper[4691]: I0930 06:54:16.634545 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fztdr" event={"ID":"441dfe2b-9391-4f05-9c16-b721ce87a854","Type":"ContainerStarted","Data":"8b24f81a68cde5d053c743c341ca2374c548c1d45bf31b5d1dbc20f7c47cc19c"} Sep 30 06:54:17 crc kubenswrapper[4691]: I0930 06:54:17.648607 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fztdr" event={"ID":"441dfe2b-9391-4f05-9c16-b721ce87a854","Type":"ContainerStarted","Data":"6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe"} Sep 30 06:54:18 crc kubenswrapper[4691]: I0930 06:54:18.660444 4691 generic.go:334] "Generic (PLEG): container finished" podID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerID="6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe" exitCode=0 Sep 30 06:54:18 crc kubenswrapper[4691]: I0930 06:54:18.660522 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fztdr" event={"ID":"441dfe2b-9391-4f05-9c16-b721ce87a854","Type":"ContainerDied","Data":"6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe"} Sep 30 06:54:19 crc kubenswrapper[4691]: I0930 06:54:19.675423 4691 generic.go:334] "Generic (PLEG): container finished" podID="214c8c4f-8184-4b59-9fcd-c1112551b5b2" containerID="ce520dbaa365e49809f98449a2e82823b4fc782d9c862fa59cb0b26714db9579" exitCode=0 Sep 30 06:54:19 crc kubenswrapper[4691]: I0930 06:54:19.675555 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" event={"ID":"214c8c4f-8184-4b59-9fcd-c1112551b5b2","Type":"ContainerDied","Data":"ce520dbaa365e49809f98449a2e82823b4fc782d9c862fa59cb0b26714db9579"} Sep 30 06:54:19 crc kubenswrapper[4691]: I0930 06:54:19.684998 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fztdr" event={"ID":"441dfe2b-9391-4f05-9c16-b721ce87a854","Type":"ContainerStarted","Data":"b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14"} Sep 30 06:54:19 crc kubenswrapper[4691]: I0930 06:54:19.776668 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/community-operators-fztdr" podStartSLOduration=2.382880368 podStartE2EDuration="4.776648728s" podCreationTimestamp="2025-09-30 06:54:15 +0000 UTC" firstStartedPulling="2025-09-30 06:54:16.636819476 +0000 UTC m=+2100.111840556" lastFinishedPulling="2025-09-30 06:54:19.030587846 +0000 UTC m=+2102.505608916" observedRunningTime="2025-09-30 06:54:19.766243625 +0000 UTC m=+2103.241264665" watchObservedRunningTime="2025-09-30 06:54:19.776648728 +0000 UTC m=+2103.251669768" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.217592 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.317917 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-metadata-combined-ca-bundle\") pod \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.317964 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-nova-metadata-neutron-config-0\") pod \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.318104 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jsf4\" (UniqueName: \"kubernetes.io/projected/214c8c4f-8184-4b59-9fcd-c1112551b5b2-kube-api-access-8jsf4\") pod \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.318132 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-inventory\") pod \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.318286 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-ovn-metadata-agent-neutron-config-0\") pod \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.318376 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-ssh-key\") pod \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\" (UID: \"214c8c4f-8184-4b59-9fcd-c1112551b5b2\") " Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.326078 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "214c8c4f-8184-4b59-9fcd-c1112551b5b2" (UID: "214c8c4f-8184-4b59-9fcd-c1112551b5b2"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.326131 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/214c8c4f-8184-4b59-9fcd-c1112551b5b2-kube-api-access-8jsf4" (OuterVolumeSpecName: "kube-api-access-8jsf4") pod "214c8c4f-8184-4b59-9fcd-c1112551b5b2" (UID: "214c8c4f-8184-4b59-9fcd-c1112551b5b2"). InnerVolumeSpecName "kube-api-access-8jsf4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.349142 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "214c8c4f-8184-4b59-9fcd-c1112551b5b2" (UID: "214c8c4f-8184-4b59-9fcd-c1112551b5b2"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.351002 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "214c8c4f-8184-4b59-9fcd-c1112551b5b2" (UID: "214c8c4f-8184-4b59-9fcd-c1112551b5b2"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.355521 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "214c8c4f-8184-4b59-9fcd-c1112551b5b2" (UID: "214c8c4f-8184-4b59-9fcd-c1112551b5b2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.370159 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-inventory" (OuterVolumeSpecName: "inventory") pod "214c8c4f-8184-4b59-9fcd-c1112551b5b2" (UID: "214c8c4f-8184-4b59-9fcd-c1112551b5b2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.420591 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jsf4\" (UniqueName: \"kubernetes.io/projected/214c8c4f-8184-4b59-9fcd-c1112551b5b2-kube-api-access-8jsf4\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.420628 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.420642 4691 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.420652 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.420668 4691 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.420677 4691 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/214c8c4f-8184-4b59-9fcd-c1112551b5b2-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.710535 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" event={"ID":"214c8c4f-8184-4b59-9fcd-c1112551b5b2","Type":"ContainerDied","Data":"f1894a9d56dbda9bd0bda787405fdd358f9f71895405d8952bfbbf9a341a5e45"} Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.711001 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1894a9d56dbda9bd0bda787405fdd358f9f71895405d8952bfbbf9a341a5e45" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.710654 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.842593 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6"] Sep 30 06:54:21 crc kubenswrapper[4691]: E0930 06:54:21.843177 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="214c8c4f-8184-4b59-9fcd-c1112551b5b2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.843190 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="214c8c4f-8184-4b59-9fcd-c1112551b5b2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.843410 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="214c8c4f-8184-4b59-9fcd-c1112551b5b2" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.844052 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.847499 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.847910 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.848157 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.848359 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.848912 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:54:21 crc kubenswrapper[4691]: I0930 06:54:21.850100 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6"] Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.030099 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89c2v\" (UniqueName: \"kubernetes.io/projected/8e08d67e-28fd-4a4b-905a-765d0e33013d-kube-api-access-89c2v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.030663 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.030848 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.031757 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.032130 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.134788 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.135354 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.135587 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89c2v\" (UniqueName: \"kubernetes.io/projected/8e08d67e-28fd-4a4b-905a-765d0e33013d-kube-api-access-89c2v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.136046 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.136253 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.141573 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.141828 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.142518 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-ssh-key\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.143019 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-inventory\") pod 
\"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.172635 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89c2v\" (UniqueName: \"kubernetes.io/projected/8e08d67e-28fd-4a4b-905a-765d0e33013d-kube-api-access-89c2v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.470274 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:54:22 crc kubenswrapper[4691]: I0930 06:54:22.846277 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6"] Sep 30 06:54:22 crc kubenswrapper[4691]: W0930 06:54:22.853592 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8e08d67e_28fd_4a4b_905a_765d0e33013d.slice/crio-987d1fca80269ce92b42143f00e3893f43c9deb15ba81982f954aed57034fa5c WatchSource:0}: Error finding container 987d1fca80269ce92b42143f00e3893f43c9deb15ba81982f954aed57034fa5c: Status 404 returned error can't find the container with id 987d1fca80269ce92b42143f00e3893f43c9deb15ba81982f954aed57034fa5c Sep 30 06:54:23 crc kubenswrapper[4691]: I0930 06:54:23.737222 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" event={"ID":"8e08d67e-28fd-4a4b-905a-765d0e33013d","Type":"ContainerStarted","Data":"bf36b329a91011c28e89feab998de5356240c0f3e8110b09151760bcafb0c9ed"} Sep 30 06:54:23 crc kubenswrapper[4691]: I0930 06:54:23.737577 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" event={"ID":"8e08d67e-28fd-4a4b-905a-765d0e33013d","Type":"ContainerStarted","Data":"987d1fca80269ce92b42143f00e3893f43c9deb15ba81982f954aed57034fa5c"} Sep 30 06:54:23 crc kubenswrapper[4691]: I0930 06:54:23.765170 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" podStartSLOduration=2.33423252 podStartE2EDuration="2.76515123s" podCreationTimestamp="2025-09-30 06:54:21 +0000 UTC" firstStartedPulling="2025-09-30 06:54:22.856858512 +0000 UTC m=+2106.331879582" lastFinishedPulling="2025-09-30 06:54:23.287777212 +0000 UTC m=+2106.762798292" observedRunningTime="2025-09-30 06:54:23.754576021 +0000 UTC m=+2107.229597061" watchObservedRunningTime="2025-09-30 06:54:23.76515123 +0000 UTC m=+2107.240172260" Sep 30 06:54:25 crc kubenswrapper[4691]: I0930 06:54:25.376961 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:25 crc kubenswrapper[4691]: I0930 06:54:25.378521 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:25 crc kubenswrapper[4691]: I0930 06:54:25.433451 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:25 crc kubenswrapper[4691]: I0930 06:54:25.834640 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:25 crc kubenswrapper[4691]: I0930 06:54:25.903442 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fztdr"] Sep 30 06:54:27 crc kubenswrapper[4691]: I0930 06:54:27.777205 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fztdr" podUID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerName="registry-server" containerID="cri-o://b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14" gracePeriod=2 Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.269577 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.371730 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-utilities\") pod \"441dfe2b-9391-4f05-9c16-b721ce87a854\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.371946 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-catalog-content\") pod \"441dfe2b-9391-4f05-9c16-b721ce87a854\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.372096 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ph87s\" (UniqueName: \"kubernetes.io/projected/441dfe2b-9391-4f05-9c16-b721ce87a854-kube-api-access-ph87s\") pod \"441dfe2b-9391-4f05-9c16-b721ce87a854\" (UID: \"441dfe2b-9391-4f05-9c16-b721ce87a854\") " Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.372758 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-utilities" (OuterVolumeSpecName: "utilities") pod "441dfe2b-9391-4f05-9c16-b721ce87a854" (UID: "441dfe2b-9391-4f05-9c16-b721ce87a854"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.383671 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/441dfe2b-9391-4f05-9c16-b721ce87a854-kube-api-access-ph87s" (OuterVolumeSpecName: "kube-api-access-ph87s") pod "441dfe2b-9391-4f05-9c16-b721ce87a854" (UID: "441dfe2b-9391-4f05-9c16-b721ce87a854"). InnerVolumeSpecName "kube-api-access-ph87s". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.414877 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "441dfe2b-9391-4f05-9c16-b721ce87a854" (UID: "441dfe2b-9391-4f05-9c16-b721ce87a854"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.474097 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ph87s\" (UniqueName: \"kubernetes.io/projected/441dfe2b-9391-4f05-9c16-b721ce87a854-kube-api-access-ph87s\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.474143 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.474159 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441dfe2b-9391-4f05-9c16-b721ce87a854-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.792623 4691 generic.go:334] "Generic (PLEG): container finished" podID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerID="b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14" exitCode=0 Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.792667 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fztdr" event={"ID":"441dfe2b-9391-4f05-9c16-b721ce87a854","Type":"ContainerDied","Data":"b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14"} Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.792700 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fztdr" event={"ID":"441dfe2b-9391-4f05-9c16-b721ce87a854","Type":"ContainerDied","Data":"8b24f81a68cde5d053c743c341ca2374c548c1d45bf31b5d1dbc20f7c47cc19c"} Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.792723 4691 scope.go:117] "RemoveContainer" containerID="b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.792725 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-fztdr" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.827584 4691 scope.go:117] "RemoveContainer" containerID="6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.865184 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fztdr"] Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.873813 4691 scope.go:117] "RemoveContainer" containerID="277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.878414 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fztdr"] Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.930761 4691 scope.go:117] "RemoveContainer" containerID="b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14" Sep 30 06:54:28 crc kubenswrapper[4691]: E0930 06:54:28.931248 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14\": container with ID starting with b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14 not found: ID does not exist" containerID="b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.931294 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14"} err="failed to get container status \"b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14\": rpc error: code = NotFound desc = could not find container \"b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14\": container with ID starting with b9684df024e66226fdd4021a8ab62e59ccba64d635bd24b88761184191cb1f14 not found: ID does not exist" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.931321 4691 scope.go:117] "RemoveContainer" containerID="6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe" Sep 30 06:54:28 crc kubenswrapper[4691]: E0930 06:54:28.931694 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe\": container with ID starting with 6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe not found: ID does not exist" containerID="6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.931718 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe"} err="failed to get container status \"6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe\": rpc error: code = NotFound desc = could not find container \"6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe\": container with ID starting with 6369ba00c4835f83777924922bebc1297e9c61aaf75303376da1b601cdf565fe not found: ID does not exist" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.931730 4691 scope.go:117] "RemoveContainer" containerID="277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c" Sep 30 06:54:28 crc kubenswrapper[4691]: E0930 06:54:28.932077 4691 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c\": container with ID starting with 277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c not found: ID does not exist" containerID="277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c" Sep 30 06:54:28 crc kubenswrapper[4691]: I0930 06:54:28.932189 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c"} err="failed to get container status \"277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c\": rpc error: code = NotFound desc = could not find container \"277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c\": container with ID starting with 277f9dfa3af4a9cfdeb6fc0fbaed23ee9bd3754bfe40444ec4c84774c7c4a21c not found: ID does not exist" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.075680 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-m6qxn"] Sep 30 06:54:29 crc kubenswrapper[4691]: E0930 06:54:29.076303 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerName="extract-content" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.076328 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerName="extract-content" Sep 30 06:54:29 crc kubenswrapper[4691]: E0930 06:54:29.076348 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerName="extract-utilities" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.076360 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerName="extract-utilities" Sep 30 06:54:29 crc kubenswrapper[4691]: E0930 06:54:29.076381 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerName="registry-server" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.076389 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerName="registry-server" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.076658 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="441dfe2b-9391-4f05-9c16-b721ce87a854" containerName="registry-server" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.092963 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.094872 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m6qxn"] Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.195160 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-catalog-content\") pod \"certified-operators-m6qxn\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.195485 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-utilities\") pod \"certified-operators-m6qxn\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.195743 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcjtg\" (UniqueName: \"kubernetes.io/projected/5e9b3fba-b31f-47f4-88ae-555d3c870855-kube-api-access-dcjtg\") pod \"certified-operators-m6qxn\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.236905 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="441dfe2b-9391-4f05-9c16-b721ce87a854" path="/var/lib/kubelet/pods/441dfe2b-9391-4f05-9c16-b721ce87a854/volumes" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.292754 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6bwdg"] Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.297324 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcjtg\" (UniqueName: \"kubernetes.io/projected/5e9b3fba-b31f-47f4-88ae-555d3c870855-kube-api-access-dcjtg\") pod \"certified-operators-m6qxn\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.297442 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-catalog-content\") pod \"certified-operators-m6qxn\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.297462 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-utilities\") pod \"certified-operators-m6qxn\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.298106 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-catalog-content\") pod \"certified-operators-m6qxn\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 
06:54:29.298135 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-utilities\") pod \"certified-operators-m6qxn\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.302272 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.319394 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bwdg"] Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.342706 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcjtg\" (UniqueName: \"kubernetes.io/projected/5e9b3fba-b31f-47f4-88ae-555d3c870855-kube-api-access-dcjtg\") pod \"certified-operators-m6qxn\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.399424 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-catalog-content\") pod \"redhat-marketplace-6bwdg\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.399483 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djfpg\" (UniqueName: \"kubernetes.io/projected/a23a8574-92cf-4328-a17a-a7e6049d1c4c-kube-api-access-djfpg\") pod \"redhat-marketplace-6bwdg\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.399511 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-utilities\") pod \"redhat-marketplace-6bwdg\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.416774 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.505124 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-catalog-content\") pod \"redhat-marketplace-6bwdg\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.505200 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djfpg\" (UniqueName: \"kubernetes.io/projected/a23a8574-92cf-4328-a17a-a7e6049d1c4c-kube-api-access-djfpg\") pod \"redhat-marketplace-6bwdg\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.505226 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-utilities\") pod \"redhat-marketplace-6bwdg\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.505692 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-utilities\") pod \"redhat-marketplace-6bwdg\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.505910 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-catalog-content\") pod \"redhat-marketplace-6bwdg\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.534368 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djfpg\" (UniqueName: \"kubernetes.io/projected/a23a8574-92cf-4328-a17a-a7e6049d1c4c-kube-api-access-djfpg\") pod \"redhat-marketplace-6bwdg\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.621395 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:29 crc kubenswrapper[4691]: I0930 06:54:29.981171 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-m6qxn"] Sep 30 06:54:30 crc kubenswrapper[4691]: I0930 06:54:30.193316 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bwdg"] Sep 30 06:54:30 crc kubenswrapper[4691]: I0930 06:54:30.815669 4691 generic.go:334] "Generic (PLEG): container finished" podID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerID="023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d" exitCode=0 Sep 30 06:54:30 crc kubenswrapper[4691]: I0930 06:54:30.815713 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6qxn" event={"ID":"5e9b3fba-b31f-47f4-88ae-555d3c870855","Type":"ContainerDied","Data":"023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d"} Sep 30 06:54:30 crc kubenswrapper[4691]: I0930 06:54:30.815754 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6qxn" event={"ID":"5e9b3fba-b31f-47f4-88ae-555d3c870855","Type":"ContainerStarted","Data":"2cd0cc0370f4aff950ab0246c2e169d2a68579beb8f3f7763d7b3833bcb5e736"} Sep 30 06:54:30 crc kubenswrapper[4691]: I0930 06:54:30.817335 4691 generic.go:334] "Generic (PLEG): container finished" podID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerID="aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3" exitCode=0 Sep 30 06:54:30 crc kubenswrapper[4691]: I0930 06:54:30.817370 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bwdg" event={"ID":"a23a8574-92cf-4328-a17a-a7e6049d1c4c","Type":"ContainerDied","Data":"aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3"} Sep 30 06:54:30 crc kubenswrapper[4691]: I0930 06:54:30.817388 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bwdg" event={"ID":"a23a8574-92cf-4328-a17a-a7e6049d1c4c","Type":"ContainerStarted","Data":"8d946435147b7f19562c204781bc684b6899b0171924b3d84e100004f2e27dc1"} Sep 30 06:54:30 crc kubenswrapper[4691]: I0930 06:54:30.817660 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.709013 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rhzk5"] Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.712223 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.721390 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rhzk5"] Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.830387 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6qxn" event={"ID":"5e9b3fba-b31f-47f4-88ae-555d3c870855","Type":"ContainerStarted","Data":"83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54"} Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.832826 4691 generic.go:334] "Generic (PLEG): container finished" podID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerID="db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8" exitCode=0 Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.832870 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bwdg" event={"ID":"a23a8574-92cf-4328-a17a-a7e6049d1c4c","Type":"ContainerDied","Data":"db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8"} Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.847116 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-catalog-content\") pod \"redhat-operators-rhzk5\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.847238 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxh8m\" (UniqueName: \"kubernetes.io/projected/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-kube-api-access-nxh8m\") pod \"redhat-operators-rhzk5\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.847283 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-utilities\") pod \"redhat-operators-rhzk5\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.949715 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxh8m\" (UniqueName: \"kubernetes.io/projected/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-kube-api-access-nxh8m\") pod \"redhat-operators-rhzk5\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.949786 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-utilities\") pod \"redhat-operators-rhzk5\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.950205 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-utilities\") pod \"redhat-operators-rhzk5\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:31 crc 
kubenswrapper[4691]: I0930 06:54:31.950518 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-catalog-content\") pod \"redhat-operators-rhzk5\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.950836 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-catalog-content\") pod \"redhat-operators-rhzk5\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:31 crc kubenswrapper[4691]: I0930 06:54:31.972779 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxh8m\" (UniqueName: \"kubernetes.io/projected/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-kube-api-access-nxh8m\") pod \"redhat-operators-rhzk5\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:32 crc kubenswrapper[4691]: I0930 06:54:32.043772 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:32 crc kubenswrapper[4691]: I0930 06:54:32.411831 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rhzk5"] Sep 30 06:54:32 crc kubenswrapper[4691]: I0930 06:54:32.846831 4691 generic.go:334] "Generic (PLEG): container finished" podID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerID="83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54" exitCode=0 Sep 30 06:54:32 crc kubenswrapper[4691]: I0930 06:54:32.846917 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6qxn" event={"ID":"5e9b3fba-b31f-47f4-88ae-555d3c870855","Type":"ContainerDied","Data":"83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54"} Sep 30 06:54:32 crc kubenswrapper[4691]: I0930 06:54:32.848975 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bwdg" event={"ID":"a23a8574-92cf-4328-a17a-a7e6049d1c4c","Type":"ContainerStarted","Data":"47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79"} Sep 30 06:54:32 crc kubenswrapper[4691]: I0930 06:54:32.851529 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhzk5" event={"ID":"8a8bb54f-c1ff-4a7a-9e04-2298d265584e","Type":"ContainerStarted","Data":"790a8091ff1bfd1160766a75a7db7f7fc5439b69b136ceef44b730af59d81678"} Sep 30 06:54:32 crc kubenswrapper[4691]: I0930 06:54:32.894359 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6bwdg" podStartSLOduration=2.437164886 podStartE2EDuration="3.894341752s" podCreationTimestamp="2025-09-30 06:54:29 +0000 UTC" firstStartedPulling="2025-09-30 06:54:30.821192539 +0000 UTC m=+2114.296213589" lastFinishedPulling="2025-09-30 06:54:32.278369415 +0000 UTC m=+2115.753390455" observedRunningTime="2025-09-30 06:54:32.885720995 +0000 UTC m=+2116.360742075" watchObservedRunningTime="2025-09-30 06:54:32.894341752 +0000 UTC m=+2116.369362792" Sep 30 06:54:33 crc kubenswrapper[4691]: I0930 06:54:33.862384 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6qxn" 
event={"ID":"5e9b3fba-b31f-47f4-88ae-555d3c870855","Type":"ContainerStarted","Data":"80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779"} Sep 30 06:54:33 crc kubenswrapper[4691]: I0930 06:54:33.864969 4691 generic.go:334] "Generic (PLEG): container finished" podID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerID="4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28" exitCode=0 Sep 30 06:54:33 crc kubenswrapper[4691]: I0930 06:54:33.865110 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhzk5" event={"ID":"8a8bb54f-c1ff-4a7a-9e04-2298d265584e","Type":"ContainerDied","Data":"4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28"} Sep 30 06:54:33 crc kubenswrapper[4691]: I0930 06:54:33.895326 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-m6qxn" podStartSLOduration=2.496268388 podStartE2EDuration="4.895305027s" podCreationTimestamp="2025-09-30 06:54:29 +0000 UTC" firstStartedPulling="2025-09-30 06:54:30.817418558 +0000 UTC m=+2114.292439598" lastFinishedPulling="2025-09-30 06:54:33.216455197 +0000 UTC m=+2116.691476237" observedRunningTime="2025-09-30 06:54:33.87823001 +0000 UTC m=+2117.353251050" watchObservedRunningTime="2025-09-30 06:54:33.895305027 +0000 UTC m=+2117.370326067" Sep 30 06:54:34 crc kubenswrapper[4691]: I0930 06:54:34.873993 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhzk5" event={"ID":"8a8bb54f-c1ff-4a7a-9e04-2298d265584e","Type":"ContainerStarted","Data":"6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2"} Sep 30 06:54:37 crc kubenswrapper[4691]: I0930 06:54:37.904661 4691 generic.go:334] "Generic (PLEG): container finished" podID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerID="6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2" exitCode=0 Sep 30 06:54:37 crc kubenswrapper[4691]: I0930 06:54:37.904770 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhzk5" event={"ID":"8a8bb54f-c1ff-4a7a-9e04-2298d265584e","Type":"ContainerDied","Data":"6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2"} Sep 30 06:54:38 crc kubenswrapper[4691]: I0930 06:54:38.921688 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhzk5" event={"ID":"8a8bb54f-c1ff-4a7a-9e04-2298d265584e","Type":"ContainerStarted","Data":"d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7"} Sep 30 06:54:38 crc kubenswrapper[4691]: I0930 06:54:38.957175 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rhzk5" podStartSLOduration=3.5648043769999997 podStartE2EDuration="7.957148282s" podCreationTimestamp="2025-09-30 06:54:31 +0000 UTC" firstStartedPulling="2025-09-30 06:54:33.867581489 +0000 UTC m=+2117.342602529" lastFinishedPulling="2025-09-30 06:54:38.259925394 +0000 UTC m=+2121.734946434" observedRunningTime="2025-09-30 06:54:38.94240916 +0000 UTC m=+2122.417430240" watchObservedRunningTime="2025-09-30 06:54:38.957148282 +0000 UTC m=+2122.432169362" Sep 30 06:54:39 crc kubenswrapper[4691]: I0930 06:54:39.418597 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:39 crc kubenswrapper[4691]: I0930 06:54:39.418638 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:39 crc kubenswrapper[4691]: I0930 06:54:39.466136 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:39 crc kubenswrapper[4691]: I0930 06:54:39.622229 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:39 crc kubenswrapper[4691]: I0930 06:54:39.622263 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:39 crc kubenswrapper[4691]: I0930 06:54:39.668538 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:39 crc kubenswrapper[4691]: I0930 06:54:39.992206 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:39 crc kubenswrapper[4691]: I0930 06:54:39.996204 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.044264 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.046035 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.075332 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m6qxn"] Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.075596 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-m6qxn" podUID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerName="registry-server" containerID="cri-o://80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779" gracePeriod=2 Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.267310 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bwdg"] Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.267729 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-6bwdg" podUID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerName="registry-server" containerID="cri-o://47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79" gracePeriod=2 Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.647516 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.693390 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-catalog-content\") pod \"5e9b3fba-b31f-47f4-88ae-555d3c870855\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.693541 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-utilities\") pod \"5e9b3fba-b31f-47f4-88ae-555d3c870855\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.693800 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcjtg\" (UniqueName: \"kubernetes.io/projected/5e9b3fba-b31f-47f4-88ae-555d3c870855-kube-api-access-dcjtg\") pod \"5e9b3fba-b31f-47f4-88ae-555d3c870855\" (UID: \"5e9b3fba-b31f-47f4-88ae-555d3c870855\") " Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.694263 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-utilities" (OuterVolumeSpecName: "utilities") pod "5e9b3fba-b31f-47f4-88ae-555d3c870855" (UID: "5e9b3fba-b31f-47f4-88ae-555d3c870855"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.698233 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.707208 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e9b3fba-b31f-47f4-88ae-555d3c870855-kube-api-access-dcjtg" (OuterVolumeSpecName: "kube-api-access-dcjtg") pod "5e9b3fba-b31f-47f4-88ae-555d3c870855" (UID: "5e9b3fba-b31f-47f4-88ae-555d3c870855"). InnerVolumeSpecName "kube-api-access-dcjtg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.781470 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.783562 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5e9b3fba-b31f-47f4-88ae-555d3c870855" (UID: "5e9b3fba-b31f-47f4-88ae-555d3c870855"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.801270 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcjtg\" (UniqueName: \"kubernetes.io/projected/5e9b3fba-b31f-47f4-88ae-555d3c870855-kube-api-access-dcjtg\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.801315 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5e9b3fba-b31f-47f4-88ae-555d3c870855-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.902800 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-catalog-content\") pod \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.902919 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-utilities\") pod \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.903088 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djfpg\" (UniqueName: \"kubernetes.io/projected/a23a8574-92cf-4328-a17a-a7e6049d1c4c-kube-api-access-djfpg\") pod \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\" (UID: \"a23a8574-92cf-4328-a17a-a7e6049d1c4c\") " Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.903690 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-utilities" (OuterVolumeSpecName: "utilities") pod "a23a8574-92cf-4328-a17a-a7e6049d1c4c" (UID: "a23a8574-92cf-4328-a17a-a7e6049d1c4c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.906785 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a23a8574-92cf-4328-a17a-a7e6049d1c4c-kube-api-access-djfpg" (OuterVolumeSpecName: "kube-api-access-djfpg") pod "a23a8574-92cf-4328-a17a-a7e6049d1c4c" (UID: "a23a8574-92cf-4328-a17a-a7e6049d1c4c"). InnerVolumeSpecName "kube-api-access-djfpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.915691 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a23a8574-92cf-4328-a17a-a7e6049d1c4c" (UID: "a23a8574-92cf-4328-a17a-a7e6049d1c4c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.963186 4691 generic.go:334] "Generic (PLEG): container finished" podID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerID="80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779" exitCode=0 Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.963243 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6qxn" event={"ID":"5e9b3fba-b31f-47f4-88ae-555d3c870855","Type":"ContainerDied","Data":"80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779"} Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.963272 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-m6qxn" event={"ID":"5e9b3fba-b31f-47f4-88ae-555d3c870855","Type":"ContainerDied","Data":"2cd0cc0370f4aff950ab0246c2e169d2a68579beb8f3f7763d7b3833bcb5e736"} Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.963291 4691 scope.go:117] "RemoveContainer" containerID="80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.963428 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-m6qxn" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.967254 4691 generic.go:334] "Generic (PLEG): container finished" podID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerID="47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79" exitCode=0 Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.967301 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bwdg" event={"ID":"a23a8574-92cf-4328-a17a-a7e6049d1c4c","Type":"ContainerDied","Data":"47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79"} Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.967330 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6bwdg" event={"ID":"a23a8574-92cf-4328-a17a-a7e6049d1c4c","Type":"ContainerDied","Data":"8d946435147b7f19562c204781bc684b6899b0171924b3d84e100004f2e27dc1"} Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.967391 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6bwdg" Sep 30 06:54:42 crc kubenswrapper[4691]: I0930 06:54:42.987053 4691 scope.go:117] "RemoveContainer" containerID="83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.000753 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-m6qxn"] Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.010038 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djfpg\" (UniqueName: \"kubernetes.io/projected/a23a8574-92cf-4328-a17a-a7e6049d1c4c-kube-api-access-djfpg\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.010075 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.010088 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a23a8574-92cf-4328-a17a-a7e6049d1c4c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.011302 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-m6qxn"] Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.020391 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bwdg"] Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.025196 4691 scope.go:117] "RemoveContainer" containerID="023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.028618 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-6bwdg"] Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.046712 4691 scope.go:117] "RemoveContainer" containerID="80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779" Sep 30 06:54:43 crc kubenswrapper[4691]: E0930 06:54:43.047640 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779\": container with ID starting with 80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779 not found: ID does not exist" containerID="80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.047788 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779"} err="failed to get container status \"80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779\": rpc error: code = NotFound desc = could not find container \"80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779\": container with ID starting with 80c0359e1290b64167707fcdcd2c7b9be998773df4d284807eb5be71c1300779 not found: ID does not exist" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.047991 4691 scope.go:117] "RemoveContainer" containerID="83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54" Sep 30 06:54:43 crc kubenswrapper[4691]: E0930 06:54:43.048381 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54\": container with ID starting with 83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54 not found: ID does not exist" containerID="83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.048409 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54"} err="failed to get container status \"83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54\": rpc error: code = NotFound desc = could not find container \"83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54\": container with ID starting with 83bdd961ff10b97c671c7e5f3dfc47e57bc04064701da539bda0b93d7009ad54 not found: ID does not exist" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.048431 4691 scope.go:117] "RemoveContainer" containerID="023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d" Sep 30 06:54:43 crc kubenswrapper[4691]: E0930 06:54:43.052417 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d\": container with ID starting with 023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d not found: ID does not exist" containerID="023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.052446 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d"} err="failed to get container status \"023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d\": rpc error: code = NotFound desc = could not find container \"023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d\": container with ID starting with 023dbf0378dbbbb9c1bd7222d28937a0b927e7c51144eec9c7d236a00db6476d not found: ID does not exist" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.052461 4691 scope.go:117] "RemoveContainer" containerID="47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.079615 4691 scope.go:117] "RemoveContainer" containerID="db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.108221 4691 scope.go:117] "RemoveContainer" containerID="aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.127442 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rhzk5" podUID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerName="registry-server" probeResult="failure" output=< Sep 30 06:54:43 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Sep 30 06:54:43 crc kubenswrapper[4691]: > Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.134177 4691 scope.go:117] "RemoveContainer" containerID="47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79" Sep 30 06:54:43 crc kubenswrapper[4691]: E0930 06:54:43.134669 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79\": container with ID starting with 
47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79 not found: ID does not exist" containerID="47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.134720 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79"} err="failed to get container status \"47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79\": rpc error: code = NotFound desc = could not find container \"47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79\": container with ID starting with 47062dc89406ec5dcb3873c8ec7ef88f1737fc148a669fc816a73d6e72ac3c79 not found: ID does not exist" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.134741 4691 scope.go:117] "RemoveContainer" containerID="db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8" Sep 30 06:54:43 crc kubenswrapper[4691]: E0930 06:54:43.135196 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8\": container with ID starting with db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8 not found: ID does not exist" containerID="db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.135244 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8"} err="failed to get container status \"db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8\": rpc error: code = NotFound desc = could not find container \"db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8\": container with ID starting with db6a2b9bad6ec02338e5ffda201bec21d8b98dfff46cb83bf71fbdd994bbaec8 not found: ID does not exist" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.135280 4691 scope.go:117] "RemoveContainer" containerID="aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3" Sep 30 06:54:43 crc kubenswrapper[4691]: E0930 06:54:43.135622 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3\": container with ID starting with aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3 not found: ID does not exist" containerID="aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.135650 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3"} err="failed to get container status \"aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3\": rpc error: code = NotFound desc = could not find container \"aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3\": container with ID starting with aef2569024042b96fbca51541f2632a4c1896dd7fbc2d9c6270db283151fcda3 not found: ID does not exist" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.238705 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e9b3fba-b31f-47f4-88ae-555d3c870855" path="/var/lib/kubelet/pods/5e9b3fba-b31f-47f4-88ae-555d3c870855/volumes" Sep 30 06:54:43 crc kubenswrapper[4691]: I0930 06:54:43.239542 
4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" path="/var/lib/kubelet/pods/a23a8574-92cf-4328-a17a-a7e6049d1c4c/volumes" Sep 30 06:54:52 crc kubenswrapper[4691]: I0930 06:54:52.126777 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:52 crc kubenswrapper[4691]: I0930 06:54:52.188557 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:52 crc kubenswrapper[4691]: I0930 06:54:52.382858 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rhzk5"] Sep 30 06:54:52 crc kubenswrapper[4691]: I0930 06:54:52.850154 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:54:52 crc kubenswrapper[4691]: I0930 06:54:52.850228 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.104040 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rhzk5" podUID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerName="registry-server" containerID="cri-o://d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7" gracePeriod=2 Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.594879 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.663309 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-utilities\") pod \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.663383 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxh8m\" (UniqueName: \"kubernetes.io/projected/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-kube-api-access-nxh8m\") pod \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.669749 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-catalog-content\") pod \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\" (UID: \"8a8bb54f-c1ff-4a7a-9e04-2298d265584e\") " Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.671079 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-utilities" (OuterVolumeSpecName: "utilities") pod "8a8bb54f-c1ff-4a7a-9e04-2298d265584e" (UID: "8a8bb54f-c1ff-4a7a-9e04-2298d265584e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.694550 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-kube-api-access-nxh8m" (OuterVolumeSpecName: "kube-api-access-nxh8m") pod "8a8bb54f-c1ff-4a7a-9e04-2298d265584e" (UID: "8a8bb54f-c1ff-4a7a-9e04-2298d265584e"). InnerVolumeSpecName "kube-api-access-nxh8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.762351 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8a8bb54f-c1ff-4a7a-9e04-2298d265584e" (UID: "8a8bb54f-c1ff-4a7a-9e04-2298d265584e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.772478 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.772666 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:54 crc kubenswrapper[4691]: I0930 06:54:54.772765 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxh8m\" (UniqueName: \"kubernetes.io/projected/8a8bb54f-c1ff-4a7a-9e04-2298d265584e-kube-api-access-nxh8m\") on node \"crc\" DevicePath \"\"" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.120680 4691 generic.go:334] "Generic (PLEG): container finished" podID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerID="d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7" exitCode=0 Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.121018 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhzk5" event={"ID":"8a8bb54f-c1ff-4a7a-9e04-2298d265584e","Type":"ContainerDied","Data":"d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7"} Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.121058 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rhzk5" event={"ID":"8a8bb54f-c1ff-4a7a-9e04-2298d265584e","Type":"ContainerDied","Data":"790a8091ff1bfd1160766a75a7db7f7fc5439b69b136ceef44b730af59d81678"} Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.121081 4691 scope.go:117] "RemoveContainer" containerID="d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.121294 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rhzk5" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.147751 4691 scope.go:117] "RemoveContainer" containerID="6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.183483 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rhzk5"] Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.195204 4691 scope.go:117] "RemoveContainer" containerID="4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.198670 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rhzk5"] Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.233984 4691 scope.go:117] "RemoveContainer" containerID="d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7" Sep 30 06:54:55 crc kubenswrapper[4691]: E0930 06:54:55.234384 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7\": container with ID starting with d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7 not found: ID does not exist" containerID="d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.234442 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7"} err="failed to get container status \"d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7\": rpc error: code = NotFound desc = could not find container \"d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7\": container with ID starting with d50d28ff5d62a2e5cec51a1aea11c25a89bc3488da9e21ae65ef6a503e6cdad7 not found: ID does not exist" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.234475 4691 scope.go:117] "RemoveContainer" containerID="6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2" Sep 30 06:54:55 crc kubenswrapper[4691]: E0930 06:54:55.234901 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2\": container with ID starting with 6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2 not found: ID does not exist" containerID="6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.234948 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2"} err="failed to get container status \"6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2\": rpc error: code = NotFound desc = could not find container \"6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2\": container with ID starting with 6cd9c6c2a87d0fdbbac4ccc0dffe1cb00367da0415dd9fca8d01f33467e976c2 not found: ID does not exist" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.234984 4691 scope.go:117] "RemoveContainer" containerID="4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28" Sep 30 06:54:55 crc kubenswrapper[4691]: E0930 06:54:55.235310 4691 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28\": container with ID starting with 4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28 not found: ID does not exist" containerID="4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.235348 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28"} err="failed to get container status \"4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28\": rpc error: code = NotFound desc = could not find container \"4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28\": container with ID starting with 4f1dbcb08e635f26e28be0acf234f10bb73933715ea8ba092ac8368e50c3ed28 not found: ID does not exist" Sep 30 06:54:55 crc kubenswrapper[4691]: I0930 06:54:55.240441 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" path="/var/lib/kubelet/pods/8a8bb54f-c1ff-4a7a-9e04-2298d265584e/volumes" Sep 30 06:55:22 crc kubenswrapper[4691]: I0930 06:55:22.850151 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:55:22 crc kubenswrapper[4691]: I0930 06:55:22.850786 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:55:52 crc kubenswrapper[4691]: I0930 06:55:52.850495 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:55:52 crc kubenswrapper[4691]: I0930 06:55:52.851165 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:55:52 crc kubenswrapper[4691]: I0930 06:55:52.851228 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:55:52 crc kubenswrapper[4691]: I0930 06:55:52.852246 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9fbd472d3121d18d27461001cb7fb9f01463cdd402f37b5b16a3246b7caa1a84"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:55:52 crc kubenswrapper[4691]: I0930 06:55:52.852340 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" 
podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://9fbd472d3121d18d27461001cb7fb9f01463cdd402f37b5b16a3246b7caa1a84" gracePeriod=600 Sep 30 06:55:53 crc kubenswrapper[4691]: I0930 06:55:53.810651 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="9fbd472d3121d18d27461001cb7fb9f01463cdd402f37b5b16a3246b7caa1a84" exitCode=0 Sep 30 06:55:53 crc kubenswrapper[4691]: I0930 06:55:53.811042 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"9fbd472d3121d18d27461001cb7fb9f01463cdd402f37b5b16a3246b7caa1a84"} Sep 30 06:55:53 crc kubenswrapper[4691]: I0930 06:55:53.811429 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f"} Sep 30 06:55:53 crc kubenswrapper[4691]: I0930 06:55:53.811470 4691 scope.go:117] "RemoveContainer" containerID="584128b586b25bf701ec8def4f23ef28936697a1f4acd379f53fcb8a26a1716a" Sep 30 06:58:22 crc kubenswrapper[4691]: I0930 06:58:22.849961 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:58:22 crc kubenswrapper[4691]: I0930 06:58:22.850533 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:58:52 crc kubenswrapper[4691]: I0930 06:58:52.849991 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:58:52 crc kubenswrapper[4691]: I0930 06:58:52.850567 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:59:13 crc kubenswrapper[4691]: I0930 06:59:13.124632 4691 generic.go:334] "Generic (PLEG): container finished" podID="8e08d67e-28fd-4a4b-905a-765d0e33013d" containerID="bf36b329a91011c28e89feab998de5356240c0f3e8110b09151760bcafb0c9ed" exitCode=0 Sep 30 06:59:13 crc kubenswrapper[4691]: I0930 06:59:13.124758 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" event={"ID":"8e08d67e-28fd-4a4b-905a-765d0e33013d","Type":"ContainerDied","Data":"bf36b329a91011c28e89feab998de5356240c0f3e8110b09151760bcafb0c9ed"} Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.629518 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.797719 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89c2v\" (UniqueName: \"kubernetes.io/projected/8e08d67e-28fd-4a4b-905a-765d0e33013d-kube-api-access-89c2v\") pod \"8e08d67e-28fd-4a4b-905a-765d0e33013d\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.797777 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-ssh-key\") pod \"8e08d67e-28fd-4a4b-905a-765d0e33013d\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.797907 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-inventory\") pod \"8e08d67e-28fd-4a4b-905a-765d0e33013d\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.797949 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-combined-ca-bundle\") pod \"8e08d67e-28fd-4a4b-905a-765d0e33013d\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.798078 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-secret-0\") pod \"8e08d67e-28fd-4a4b-905a-765d0e33013d\" (UID: \"8e08d67e-28fd-4a4b-905a-765d0e33013d\") " Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.804090 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e08d67e-28fd-4a4b-905a-765d0e33013d-kube-api-access-89c2v" (OuterVolumeSpecName: "kube-api-access-89c2v") pod "8e08d67e-28fd-4a4b-905a-765d0e33013d" (UID: "8e08d67e-28fd-4a4b-905a-765d0e33013d"). InnerVolumeSpecName "kube-api-access-89c2v". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.804147 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8e08d67e-28fd-4a4b-905a-765d0e33013d" (UID: "8e08d67e-28fd-4a4b-905a-765d0e33013d"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.827758 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8e08d67e-28fd-4a4b-905a-765d0e33013d" (UID: "8e08d67e-28fd-4a4b-905a-765d0e33013d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.835967 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-inventory" (OuterVolumeSpecName: "inventory") pod "8e08d67e-28fd-4a4b-905a-765d0e33013d" (UID: "8e08d67e-28fd-4a4b-905a-765d0e33013d"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.843605 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "8e08d67e-28fd-4a4b-905a-765d0e33013d" (UID: "8e08d67e-28fd-4a4b-905a-765d0e33013d"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.900091 4691 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.900146 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89c2v\" (UniqueName: \"kubernetes.io/projected/8e08d67e-28fd-4a4b-905a-765d0e33013d-kube-api-access-89c2v\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.900160 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.900172 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:14 crc kubenswrapper[4691]: I0930 06:59:14.900187 4691 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e08d67e-28fd-4a4b-905a-765d0e33013d-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.153070 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" event={"ID":"8e08d67e-28fd-4a4b-905a-765d0e33013d","Type":"ContainerDied","Data":"987d1fca80269ce92b42143f00e3893f43c9deb15ba81982f954aed57034fa5c"} Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.153108 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="987d1fca80269ce92b42143f00e3893f43c9deb15ba81982f954aed57034fa5c" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.153169 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.287733 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh"] Sep 30 06:59:15 crc kubenswrapper[4691]: E0930 06:59:15.288512 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerName="registry-server" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.288653 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerName="registry-server" Sep 30 06:59:15 crc kubenswrapper[4691]: E0930 06:59:15.288792 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e08d67e-28fd-4a4b-905a-765d0e33013d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.288876 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e08d67e-28fd-4a4b-905a-765d0e33013d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 06:59:15 crc kubenswrapper[4691]: E0930 06:59:15.288999 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerName="registry-server" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.289082 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerName="registry-server" Sep 30 06:59:15 crc kubenswrapper[4691]: E0930 06:59:15.289176 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerName="extract-content" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.289259 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerName="extract-content" Sep 30 06:59:15 crc kubenswrapper[4691]: E0930 06:59:15.289347 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerName="extract-utilities" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.289418 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerName="extract-utilities" Sep 30 06:59:15 crc kubenswrapper[4691]: E0930 06:59:15.289502 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerName="extract-utilities" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.289583 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerName="extract-utilities" Sep 30 06:59:15 crc kubenswrapper[4691]: E0930 06:59:15.289657 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerName="extract-content" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.289727 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerName="extract-content" Sep 30 06:59:15 crc kubenswrapper[4691]: E0930 06:59:15.289810 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerName="extract-utilities" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.289916 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerName="extract-utilities" Sep 30 06:59:15 crc kubenswrapper[4691]: 
E0930 06:59:15.290032 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerName="registry-server" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.290126 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerName="registry-server" Sep 30 06:59:15 crc kubenswrapper[4691]: E0930 06:59:15.290242 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerName="extract-content" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.290347 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerName="extract-content" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.290757 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e08d67e-28fd-4a4b-905a-765d0e33013d" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.290970 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e9b3fba-b31f-47f4-88ae-555d3c870855" containerName="registry-server" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.291089 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a8bb54f-c1ff-4a7a-9e04-2298d265584e" containerName="registry-server" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.291237 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a23a8574-92cf-4328-a17a-a7e6049d1c4c" containerName="registry-server" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.292436 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.296170 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.296195 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.296538 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.296840 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.296873 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.297036 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.297070 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.308540 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh"] Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.412385 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c6e5786f-d234-454c-8276-5355726052be-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: 
\"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.412450 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.412474 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhbgw\" (UniqueName: \"kubernetes.io/projected/c6e5786f-d234-454c-8276-5355726052be-kube-api-access-xhbgw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.412721 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.412779 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.412814 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.412850 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.413135 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.413371 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.515319 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c6e5786f-d234-454c-8276-5355726052be-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.515375 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.515407 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhbgw\" (UniqueName: \"kubernetes.io/projected/c6e5786f-d234-454c-8276-5355726052be-kube-api-access-xhbgw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.515472 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.515495 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.515514 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.515533 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.515561 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.515614 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.516264 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c6e5786f-d234-454c-8276-5355726052be-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.519033 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.519384 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.519739 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.520465 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.521083 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.521381 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-inventory\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.522282 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-ssh-key\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.531250 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhbgw\" (UniqueName: \"kubernetes.io/projected/c6e5786f-d234-454c-8276-5355726052be-kube-api-access-xhbgw\") pod \"nova-edpm-deployment-openstack-edpm-ipam-z8bsh\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:15 crc kubenswrapper[4691]: I0930 06:59:15.629561 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 06:59:16 crc kubenswrapper[4691]: I0930 06:59:16.174946 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh"] Sep 30 06:59:17 crc kubenswrapper[4691]: I0930 06:59:17.200930 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" event={"ID":"c6e5786f-d234-454c-8276-5355726052be","Type":"ContainerStarted","Data":"2e5d220baccf418195f86a7db909bf8b46901deebbc0522f100b5a742835991b"} Sep 30 06:59:17 crc kubenswrapper[4691]: I0930 06:59:17.201206 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" event={"ID":"c6e5786f-d234-454c-8276-5355726052be","Type":"ContainerStarted","Data":"8163a07915ad4f024a8700044ad29abb2a0444f5ed4d67c13cad834912485688"} Sep 30 06:59:17 crc kubenswrapper[4691]: I0930 06:59:17.225434 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" podStartSLOduration=1.614872207 podStartE2EDuration="2.225412589s" podCreationTimestamp="2025-09-30 06:59:15 +0000 UTC" firstStartedPulling="2025-09-30 06:59:16.188045315 +0000 UTC m=+2399.663066355" lastFinishedPulling="2025-09-30 06:59:16.798585667 +0000 UTC m=+2400.273606737" observedRunningTime="2025-09-30 06:59:17.222444255 +0000 UTC m=+2400.697465285" watchObservedRunningTime="2025-09-30 06:59:17.225412589 +0000 UTC m=+2400.700433639" Sep 30 06:59:22 crc kubenswrapper[4691]: I0930 06:59:22.850192 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 06:59:22 crc kubenswrapper[4691]: I0930 06:59:22.851795 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 06:59:22 crc kubenswrapper[4691]: I0930 06:59:22.851862 4691 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 06:59:22 crc kubenswrapper[4691]: I0930 06:59:22.852773 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 06:59:22 crc kubenswrapper[4691]: I0930 06:59:22.852845 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" gracePeriod=600 Sep 30 06:59:22 crc kubenswrapper[4691]: E0930 06:59:22.989233 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:59:23 crc kubenswrapper[4691]: I0930 06:59:23.268230 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" exitCode=0 Sep 30 06:59:23 crc kubenswrapper[4691]: I0930 06:59:23.268323 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f"} Sep 30 06:59:23 crc kubenswrapper[4691]: I0930 06:59:23.268426 4691 scope.go:117] "RemoveContainer" containerID="9fbd472d3121d18d27461001cb7fb9f01463cdd402f37b5b16a3246b7caa1a84" Sep 30 06:59:23 crc kubenswrapper[4691]: I0930 06:59:23.269560 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 06:59:23 crc kubenswrapper[4691]: E0930 06:59:23.270076 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:59:37 crc kubenswrapper[4691]: I0930 06:59:37.239081 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 06:59:37 crc kubenswrapper[4691]: E0930 06:59:37.240458 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 06:59:51 
crc kubenswrapper[4691]: I0930 06:59:51.225980 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 06:59:51 crc kubenswrapper[4691]: E0930 06:59:51.226976 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.171590 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4"] Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.176047 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.178488 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.180345 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.190505 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4"] Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.289841 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b641bf2-9852-4550-bbb0-9add7388a1f6-secret-volume\") pod \"collect-profiles-29320260-8mrl4\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.291229 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zwcl\" (UniqueName: \"kubernetes.io/projected/9b641bf2-9852-4550-bbb0-9add7388a1f6-kube-api-access-2zwcl\") pod \"collect-profiles-29320260-8mrl4\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.291284 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b641bf2-9852-4550-bbb0-9add7388a1f6-config-volume\") pod \"collect-profiles-29320260-8mrl4\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.394772 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b641bf2-9852-4550-bbb0-9add7388a1f6-secret-volume\") pod \"collect-profiles-29320260-8mrl4\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.394970 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-2zwcl\" (UniqueName: \"kubernetes.io/projected/9b641bf2-9852-4550-bbb0-9add7388a1f6-kube-api-access-2zwcl\") pod \"collect-profiles-29320260-8mrl4\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.395019 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b641bf2-9852-4550-bbb0-9add7388a1f6-config-volume\") pod \"collect-profiles-29320260-8mrl4\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.396509 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b641bf2-9852-4550-bbb0-9add7388a1f6-config-volume\") pod \"collect-profiles-29320260-8mrl4\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.406553 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b641bf2-9852-4550-bbb0-9add7388a1f6-secret-volume\") pod \"collect-profiles-29320260-8mrl4\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.416711 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zwcl\" (UniqueName: \"kubernetes.io/projected/9b641bf2-9852-4550-bbb0-9add7388a1f6-kube-api-access-2zwcl\") pod \"collect-profiles-29320260-8mrl4\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.509282 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:00 crc kubenswrapper[4691]: W0930 07:00:00.998156 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b641bf2_9852_4550_bbb0_9add7388a1f6.slice/crio-70ee37c8a63ee388ddb5ddd85954c972898a212b086414d4fcba2e4d6cd8a5d7 WatchSource:0}: Error finding container 70ee37c8a63ee388ddb5ddd85954c972898a212b086414d4fcba2e4d6cd8a5d7: Status 404 returned error can't find the container with id 70ee37c8a63ee388ddb5ddd85954c972898a212b086414d4fcba2e4d6cd8a5d7 Sep 30 07:00:00 crc kubenswrapper[4691]: I0930 07:00:00.998578 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4"] Sep 30 07:00:01 crc kubenswrapper[4691]: I0930 07:00:01.725765 4691 generic.go:334] "Generic (PLEG): container finished" podID="9b641bf2-9852-4550-bbb0-9add7388a1f6" containerID="c25f8b2d85989887fdce89b2427ec3a0e8dac608cf51ab17b1ea51a207231a25" exitCode=0 Sep 30 07:00:01 crc kubenswrapper[4691]: I0930 07:00:01.725817 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" event={"ID":"9b641bf2-9852-4550-bbb0-9add7388a1f6","Type":"ContainerDied","Data":"c25f8b2d85989887fdce89b2427ec3a0e8dac608cf51ab17b1ea51a207231a25"} Sep 30 07:00:01 crc kubenswrapper[4691]: I0930 07:00:01.726069 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" event={"ID":"9b641bf2-9852-4550-bbb0-9add7388a1f6","Type":"ContainerStarted","Data":"70ee37c8a63ee388ddb5ddd85954c972898a212b086414d4fcba2e4d6cd8a5d7"} Sep 30 07:00:02 crc kubenswrapper[4691]: I0930 07:00:02.226235 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:00:02 crc kubenswrapper[4691]: E0930 07:00:02.227078 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.194113 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.264501 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b641bf2-9852-4550-bbb0-9add7388a1f6-config-volume\") pod \"9b641bf2-9852-4550-bbb0-9add7388a1f6\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.264804 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b641bf2-9852-4550-bbb0-9add7388a1f6-secret-volume\") pod \"9b641bf2-9852-4550-bbb0-9add7388a1f6\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.264855 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2zwcl\" (UniqueName: \"kubernetes.io/projected/9b641bf2-9852-4550-bbb0-9add7388a1f6-kube-api-access-2zwcl\") pod \"9b641bf2-9852-4550-bbb0-9add7388a1f6\" (UID: \"9b641bf2-9852-4550-bbb0-9add7388a1f6\") " Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.266193 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b641bf2-9852-4550-bbb0-9add7388a1f6-config-volume" (OuterVolumeSpecName: "config-volume") pod "9b641bf2-9852-4550-bbb0-9add7388a1f6" (UID: "9b641bf2-9852-4550-bbb0-9add7388a1f6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.274630 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b641bf2-9852-4550-bbb0-9add7388a1f6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9b641bf2-9852-4550-bbb0-9add7388a1f6" (UID: "9b641bf2-9852-4550-bbb0-9add7388a1f6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.274821 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b641bf2-9852-4550-bbb0-9add7388a1f6-kube-api-access-2zwcl" (OuterVolumeSpecName: "kube-api-access-2zwcl") pod "9b641bf2-9852-4550-bbb0-9add7388a1f6" (UID: "9b641bf2-9852-4550-bbb0-9add7388a1f6"). InnerVolumeSpecName "kube-api-access-2zwcl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.367363 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9b641bf2-9852-4550-bbb0-9add7388a1f6-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.367401 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2zwcl\" (UniqueName: \"kubernetes.io/projected/9b641bf2-9852-4550-bbb0-9add7388a1f6-kube-api-access-2zwcl\") on node \"crc\" DevicePath \"\"" Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.367413 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9b641bf2-9852-4550-bbb0-9add7388a1f6-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.747442 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" event={"ID":"9b641bf2-9852-4550-bbb0-9add7388a1f6","Type":"ContainerDied","Data":"70ee37c8a63ee388ddb5ddd85954c972898a212b086414d4fcba2e4d6cd8a5d7"} Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.747766 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="70ee37c8a63ee388ddb5ddd85954c972898a212b086414d4fcba2e4d6cd8a5d7" Sep 30 07:00:03 crc kubenswrapper[4691]: I0930 07:00:03.747514 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4" Sep 30 07:00:04 crc kubenswrapper[4691]: I0930 07:00:04.291393 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt"] Sep 30 07:00:04 crc kubenswrapper[4691]: I0930 07:00:04.300294 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320215-9kldt"] Sep 30 07:00:05 crc kubenswrapper[4691]: I0930 07:00:05.241386 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2df0f77-43e6-417a-a2ef-dfee3e80cbd8" path="/var/lib/kubelet/pods/e2df0f77-43e6-417a-a2ef-dfee3e80cbd8/volumes" Sep 30 07:00:15 crc kubenswrapper[4691]: I0930 07:00:15.224463 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:00:15 crc kubenswrapper[4691]: E0930 07:00:15.225191 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:00:25 crc kubenswrapper[4691]: I0930 07:00:25.968232 4691 scope.go:117] "RemoveContainer" containerID="3e3eaea348966226bce3d5ed1f615a20e73b6acce16227c61e972962c9df8826" Sep 30 07:00:30 crc kubenswrapper[4691]: I0930 07:00:30.224712 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:00:30 crc kubenswrapper[4691]: E0930 07:00:30.225333 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:00:44 crc kubenswrapper[4691]: I0930 07:00:44.225695 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:00:44 crc kubenswrapper[4691]: E0930 07:00:44.226505 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:00:57 crc kubenswrapper[4691]: I0930 07:00:57.234924 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:00:57 crc kubenswrapper[4691]: E0930 07:00:57.235817 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.157776 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320261-kh7r8"] Sep 30 07:01:00 crc kubenswrapper[4691]: E0930 07:01:00.158633 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b641bf2-9852-4550-bbb0-9add7388a1f6" containerName="collect-profiles" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.158651 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b641bf2-9852-4550-bbb0-9add7388a1f6" containerName="collect-profiles" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.159120 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b641bf2-9852-4550-bbb0-9add7388a1f6" containerName="collect-profiles" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.159988 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.185761 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320261-kh7r8"] Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.249658 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-fernet-keys\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.249747 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-combined-ca-bundle\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.249845 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-config-data\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.249953 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8r78m\" (UniqueName: \"kubernetes.io/projected/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-kube-api-access-8r78m\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.352202 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-fernet-keys\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.352278 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-combined-ca-bundle\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.352353 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-config-data\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.352416 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8r78m\" (UniqueName: \"kubernetes.io/projected/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-kube-api-access-8r78m\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.360143 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-combined-ca-bundle\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.360485 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-config-data\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.368843 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-fernet-keys\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.376715 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8r78m\" (UniqueName: \"kubernetes.io/projected/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-kube-api-access-8r78m\") pod \"keystone-cron-29320261-kh7r8\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.495746 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:00 crc kubenswrapper[4691]: I0930 07:01:00.978733 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320261-kh7r8"] Sep 30 07:01:01 crc kubenswrapper[4691]: I0930 07:01:01.543041 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320261-kh7r8" event={"ID":"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b","Type":"ContainerStarted","Data":"0aa98646653eb0f9e63abd15d3ac3198421334f8670ac7cf16cc68a824562175"} Sep 30 07:01:01 crc kubenswrapper[4691]: I0930 07:01:01.545189 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320261-kh7r8" event={"ID":"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b","Type":"ContainerStarted","Data":"7fcf09781a5fa1bd13229eb5d220749afd3c1e48d14b4e9fd0c438a986e1a3cb"} Sep 30 07:01:01 crc kubenswrapper[4691]: I0930 07:01:01.581437 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320261-kh7r8" podStartSLOduration=1.581406493 podStartE2EDuration="1.581406493s" podCreationTimestamp="2025-09-30 07:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:01:01.566062753 +0000 UTC m=+2505.041083823" watchObservedRunningTime="2025-09-30 07:01:01.581406493 +0000 UTC m=+2505.056427573" Sep 30 07:01:04 crc kubenswrapper[4691]: I0930 07:01:04.583445 4691 generic.go:334] "Generic (PLEG): container finished" podID="a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b" containerID="0aa98646653eb0f9e63abd15d3ac3198421334f8670ac7cf16cc68a824562175" exitCode=0 Sep 30 07:01:04 crc kubenswrapper[4691]: I0930 07:01:04.583506 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320261-kh7r8" event={"ID":"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b","Type":"ContainerDied","Data":"0aa98646653eb0f9e63abd15d3ac3198421334f8670ac7cf16cc68a824562175"} Sep 30 07:01:06 crc kubenswrapper[4691]: 
I0930 07:01:06.076652 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.196007 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-fernet-keys\") pod \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.196059 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8r78m\" (UniqueName: \"kubernetes.io/projected/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-kube-api-access-8r78m\") pod \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.196259 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-combined-ca-bundle\") pod \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.196315 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-config-data\") pod \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\" (UID: \"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b\") " Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.202789 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-kube-api-access-8r78m" (OuterVolumeSpecName: "kube-api-access-8r78m") pod "a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b" (UID: "a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b"). InnerVolumeSpecName "kube-api-access-8r78m". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.217311 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b" (UID: "a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.237094 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b" (UID: "a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.261790 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-config-data" (OuterVolumeSpecName: "config-data") pod "a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b" (UID: "a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.300660 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.300694 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.300706 4691 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.300720 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8r78m\" (UniqueName: \"kubernetes.io/projected/a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b-kube-api-access-8r78m\") on node \"crc\" DevicePath \"\"" Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.614692 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320261-kh7r8" event={"ID":"a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b","Type":"ContainerDied","Data":"7fcf09781a5fa1bd13229eb5d220749afd3c1e48d14b4e9fd0c438a986e1a3cb"} Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.614749 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7fcf09781a5fa1bd13229eb5d220749afd3c1e48d14b4e9fd0c438a986e1a3cb" Sep 30 07:01:06 crc kubenswrapper[4691]: I0930 07:01:06.614785 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320261-kh7r8" Sep 30 07:01:09 crc kubenswrapper[4691]: I0930 07:01:09.224865 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:01:09 crc kubenswrapper[4691]: E0930 07:01:09.225491 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:01:23 crc kubenswrapper[4691]: I0930 07:01:23.226161 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:01:23 crc kubenswrapper[4691]: E0930 07:01:23.226953 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:01:35 crc kubenswrapper[4691]: I0930 07:01:35.226601 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:01:35 crc kubenswrapper[4691]: E0930 07:01:35.227802 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:01:46 crc kubenswrapper[4691]: I0930 07:01:46.224887 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:01:46 crc kubenswrapper[4691]: E0930 07:01:46.226973 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:02:00 crc kubenswrapper[4691]: I0930 07:02:00.225326 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:02:00 crc kubenswrapper[4691]: E0930 07:02:00.226652 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:02:15 crc kubenswrapper[4691]: I0930 07:02:15.224777 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:02:15 crc kubenswrapper[4691]: E0930 07:02:15.227120 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:02:29 crc kubenswrapper[4691]: I0930 07:02:29.225829 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:02:29 crc kubenswrapper[4691]: E0930 07:02:29.226621 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:02:44 crc kubenswrapper[4691]: I0930 07:02:44.225350 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:02:44 crc kubenswrapper[4691]: E0930 07:02:44.226532 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:02:55 crc kubenswrapper[4691]: I0930 07:02:55.225450 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:02:55 crc kubenswrapper[4691]: E0930 07:02:55.227564 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:03:10 crc kubenswrapper[4691]: I0930 07:03:10.225642 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:03:10 crc kubenswrapper[4691]: E0930 07:03:10.227764 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:03:23 crc kubenswrapper[4691]: I0930 07:03:23.225038 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:03:23 crc kubenswrapper[4691]: E0930 07:03:23.228432 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:03:23 crc kubenswrapper[4691]: I0930 07:03:23.247372 4691 generic.go:334] "Generic (PLEG): container finished" podID="c6e5786f-d234-454c-8276-5355726052be" containerID="2e5d220baccf418195f86a7db909bf8b46901deebbc0522f100b5a742835991b" exitCode=0 Sep 30 07:03:23 crc kubenswrapper[4691]: I0930 07:03:23.247436 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" event={"ID":"c6e5786f-d234-454c-8276-5355726052be","Type":"ContainerDied","Data":"2e5d220baccf418195f86a7db909bf8b46901deebbc0522f100b5a742835991b"} Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.771767 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.860465 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-ssh-key\") pod \"c6e5786f-d234-454c-8276-5355726052be\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.860522 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-1\") pod \"c6e5786f-d234-454c-8276-5355726052be\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.860629 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhbgw\" (UniqueName: \"kubernetes.io/projected/c6e5786f-d234-454c-8276-5355726052be-kube-api-access-xhbgw\") pod \"c6e5786f-d234-454c-8276-5355726052be\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.860711 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-combined-ca-bundle\") pod \"c6e5786f-d234-454c-8276-5355726052be\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.860745 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-inventory\") pod \"c6e5786f-d234-454c-8276-5355726052be\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.860764 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-0\") pod \"c6e5786f-d234-454c-8276-5355726052be\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.860821 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c6e5786f-d234-454c-8276-5355726052be-nova-extra-config-0\") pod \"c6e5786f-d234-454c-8276-5355726052be\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.860958 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-0\") pod \"c6e5786f-d234-454c-8276-5355726052be\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.861105 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-1\") pod \"c6e5786f-d234-454c-8276-5355726052be\" (UID: \"c6e5786f-d234-454c-8276-5355726052be\") " Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.867673 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "c6e5786f-d234-454c-8276-5355726052be" (UID: "c6e5786f-d234-454c-8276-5355726052be"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.868207 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e5786f-d234-454c-8276-5355726052be-kube-api-access-xhbgw" (OuterVolumeSpecName: "kube-api-access-xhbgw") pod "c6e5786f-d234-454c-8276-5355726052be" (UID: "c6e5786f-d234-454c-8276-5355726052be"). InnerVolumeSpecName "kube-api-access-xhbgw". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.890283 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "c6e5786f-d234-454c-8276-5355726052be" (UID: "c6e5786f-d234-454c-8276-5355726052be"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.897720 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "c6e5786f-d234-454c-8276-5355726052be" (UID: "c6e5786f-d234-454c-8276-5355726052be"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.899433 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "c6e5786f-d234-454c-8276-5355726052be" (UID: "c6e5786f-d234-454c-8276-5355726052be"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.900320 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e5786f-d234-454c-8276-5355726052be-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "c6e5786f-d234-454c-8276-5355726052be" (UID: "c6e5786f-d234-454c-8276-5355726052be"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.900537 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-inventory" (OuterVolumeSpecName: "inventory") pod "c6e5786f-d234-454c-8276-5355726052be" (UID: "c6e5786f-d234-454c-8276-5355726052be"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.903987 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "c6e5786f-d234-454c-8276-5355726052be" (UID: "c6e5786f-d234-454c-8276-5355726052be"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.910357 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "c6e5786f-d234-454c-8276-5355726052be" (UID: "c6e5786f-d234-454c-8276-5355726052be"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.963945 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhbgw\" (UniqueName: \"kubernetes.io/projected/c6e5786f-d234-454c-8276-5355726052be-kube-api-access-xhbgw\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.963973 4691 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.963984 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-inventory\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.963995 4691 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.964006 4691 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/c6e5786f-d234-454c-8276-5355726052be-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.964016 4691 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.964024 4691 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.964034 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-ssh-key\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:24 crc kubenswrapper[4691]: I0930 07:03:24.964043 4691 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/c6e5786f-d234-454c-8276-5355726052be-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.276212 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" event={"ID":"c6e5786f-d234-454c-8276-5355726052be","Type":"ContainerDied","Data":"8163a07915ad4f024a8700044ad29abb2a0444f5ed4d67c13cad834912485688"} Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.276262 4691 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="8163a07915ad4f024a8700044ad29abb2a0444f5ed4d67c13cad834912485688" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.276262 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-z8bsh" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.485508 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9"] Sep 30 07:03:25 crc kubenswrapper[4691]: E0930 07:03:25.486386 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b" containerName="keystone-cron" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.486407 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b" containerName="keystone-cron" Sep 30 07:03:25 crc kubenswrapper[4691]: E0930 07:03:25.486467 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e5786f-d234-454c-8276-5355726052be" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.486477 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e5786f-d234-454c-8276-5355726052be" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.487054 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b" containerName="keystone-cron" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.487104 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e5786f-d234-454c-8276-5355726052be" containerName="nova-edpm-deployment-openstack-edpm-ipam" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.488355 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.501496 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.519592 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9"] Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.539426 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.540039 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.540129 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-vff8k" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.540211 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.586966 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.587026 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.587090 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.587141 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.587205 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 
30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.587238 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzbz8\" (UniqueName: \"kubernetes.io/projected/8c1bc2df-cff0-4d61-9773-0db30010956c-kube-api-access-wzbz8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.587266 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.688940 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.689000 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.689032 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.689074 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.689115 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.689143 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzbz8\" (UniqueName: \"kubernetes.io/projected/8c1bc2df-cff0-4d61-9773-0db30010956c-kube-api-access-wzbz8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: 
\"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.689166 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.692605 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.693377 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.693698 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ssh-key\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.693771 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.693842 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.694694 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.724407 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzbz8\" (UniqueName: \"kubernetes.io/projected/8c1bc2df-cff0-4d61-9773-0db30010956c-kube-api-access-wzbz8\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-cthz9\" (UID: 
\"8c1bc2df-cff0-4d61-9773-0db30010956c\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:25 crc kubenswrapper[4691]: I0930 07:03:25.846073 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:03:26 crc kubenswrapper[4691]: I0930 07:03:26.448136 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9"] Sep 30 07:03:26 crc kubenswrapper[4691]: I0930 07:03:26.452528 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:03:27 crc kubenswrapper[4691]: I0930 07:03:27.301660 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" event={"ID":"8c1bc2df-cff0-4d61-9773-0db30010956c","Type":"ContainerStarted","Data":"87de3534c90c4ddc69a46cfab9754ab4399d9d4e4e0be0c502d885477309e7d9"} Sep 30 07:03:27 crc kubenswrapper[4691]: I0930 07:03:27.302039 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" event={"ID":"8c1bc2df-cff0-4d61-9773-0db30010956c","Type":"ContainerStarted","Data":"6a6f854609800e77a101be0cb86559d7bc8fdf157a59da96f3dd54c2f05a2bbb"} Sep 30 07:03:27 crc kubenswrapper[4691]: I0930 07:03:27.335202 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" podStartSLOduration=1.87211569 podStartE2EDuration="2.335174338s" podCreationTimestamp="2025-09-30 07:03:25 +0000 UTC" firstStartedPulling="2025-09-30 07:03:26.45230178 +0000 UTC m=+2649.927322830" lastFinishedPulling="2025-09-30 07:03:26.915360438 +0000 UTC m=+2650.390381478" observedRunningTime="2025-09-30 07:03:27.320969052 +0000 UTC m=+2650.795990132" watchObservedRunningTime="2025-09-30 07:03:27.335174338 +0000 UTC m=+2650.810195418" Sep 30 07:03:37 crc kubenswrapper[4691]: I0930 07:03:37.233875 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:03:37 crc kubenswrapper[4691]: E0930 07:03:37.234589 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:03:50 crc kubenswrapper[4691]: I0930 07:03:50.225307 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:03:50 crc kubenswrapper[4691]: E0930 07:03:50.226450 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:04:04 crc kubenswrapper[4691]: I0930 07:04:04.227742 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:04:04 crc 
kubenswrapper[4691]: E0930 07:04:04.229436 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:04:19 crc kubenswrapper[4691]: I0930 07:04:19.225932 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:04:19 crc kubenswrapper[4691]: E0930 07:04:19.229125 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:04:34 crc kubenswrapper[4691]: I0930 07:04:34.225669 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:04:35 crc kubenswrapper[4691]: I0930 07:04:35.146932 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"be8ef5b3095c5d5b2e4262f242c72c9857c3992dd41b5ac407f3540740dd3d31"} Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.231425 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2dnzh"] Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.237412 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.274590 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2dnzh"] Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.391753 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-catalog-content\") pod \"community-operators-2dnzh\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") " pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.391918 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngs69\" (UniqueName: \"kubernetes.io/projected/fd9a3794-0dec-4779-928c-cc0742f12e96-kube-api-access-ngs69\") pod \"community-operators-2dnzh\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") " pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.391968 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-utilities\") pod \"community-operators-2dnzh\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") " pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.493833 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-catalog-content\") pod \"community-operators-2dnzh\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") " pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.494044 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ngs69\" (UniqueName: \"kubernetes.io/projected/fd9a3794-0dec-4779-928c-cc0742f12e96-kube-api-access-ngs69\") pod \"community-operators-2dnzh\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") " pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.494117 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-utilities\") pod \"community-operators-2dnzh\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") " pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.494648 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-catalog-content\") pod \"community-operators-2dnzh\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") " pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.494789 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-utilities\") pod \"community-operators-2dnzh\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") " pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.525843 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ngs69\" (UniqueName: \"kubernetes.io/projected/fd9a3794-0dec-4779-928c-cc0742f12e96-kube-api-access-ngs69\") pod \"community-operators-2dnzh\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") " pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:48 crc kubenswrapper[4691]: I0930 07:04:48.582311 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dnzh" Sep 30 07:04:49 crc kubenswrapper[4691]: I0930 07:04:49.053531 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2dnzh"] Sep 30 07:04:49 crc kubenswrapper[4691]: W0930 07:04:49.054443 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd9a3794_0dec_4779_928c_cc0742f12e96.slice/crio-c83fd5bc1f52cd563c111b9a2c9b07ba62c1eb9262ff5ca538ceb185ac859dfe WatchSource:0}: Error finding container c83fd5bc1f52cd563c111b9a2c9b07ba62c1eb9262ff5ca538ceb185ac859dfe: Status 404 returned error can't find the container with id c83fd5bc1f52cd563c111b9a2c9b07ba62c1eb9262ff5ca538ceb185ac859dfe Sep 30 07:04:49 crc kubenswrapper[4691]: I0930 07:04:49.310965 4691 generic.go:334] "Generic (PLEG): container finished" podID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerID="f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521" exitCode=0 Sep 30 07:04:49 crc kubenswrapper[4691]: I0930 07:04:49.311019 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dnzh" event={"ID":"fd9a3794-0dec-4779-928c-cc0742f12e96","Type":"ContainerDied","Data":"f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521"} Sep 30 07:04:49 crc kubenswrapper[4691]: I0930 07:04:49.311063 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dnzh" event={"ID":"fd9a3794-0dec-4779-928c-cc0742f12e96","Type":"ContainerStarted","Data":"c83fd5bc1f52cd563c111b9a2c9b07ba62c1eb9262ff5ca538ceb185ac859dfe"} Sep 30 07:04:50 crc kubenswrapper[4691]: I0930 07:04:50.991653 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f6tps"] Sep 30 07:04:50 crc kubenswrapper[4691]: I0930 07:04:50.994935 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.004654 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f6tps"]
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.149351 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prqvk\" (UniqueName: \"kubernetes.io/projected/a3478733-7984-4f66-82f1-b95b9fcc3e43-kube-api-access-prqvk\") pod \"redhat-operators-f6tps\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") " pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.149852 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-catalog-content\") pod \"redhat-operators-f6tps\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") " pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.149906 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-utilities\") pod \"redhat-operators-f6tps\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") " pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.251338 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prqvk\" (UniqueName: \"kubernetes.io/projected/a3478733-7984-4f66-82f1-b95b9fcc3e43-kube-api-access-prqvk\") pod \"redhat-operators-f6tps\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") " pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.251766 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-catalog-content\") pod \"redhat-operators-f6tps\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") " pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.251908 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-utilities\") pod \"redhat-operators-f6tps\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") " pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.252694 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-utilities\") pod \"redhat-operators-f6tps\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") " pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.252801 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-catalog-content\") pod \"redhat-operators-f6tps\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") " pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.282873 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prqvk\" (UniqueName: \"kubernetes.io/projected/a3478733-7984-4f66-82f1-b95b9fcc3e43-kube-api-access-prqvk\") pod \"redhat-operators-f6tps\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") " pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.325807 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.341729 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dnzh" event={"ID":"fd9a3794-0dec-4779-928c-cc0742f12e96","Type":"ContainerStarted","Data":"3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e"}
Sep 30 07:04:51 crc kubenswrapper[4691]: I0930 07:04:51.627384 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f6tps"]
Sep 30 07:04:52 crc kubenswrapper[4691]: I0930 07:04:52.357512 4691 generic.go:334] "Generic (PLEG): container finished" podID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerID="3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e" exitCode=0
Sep 30 07:04:52 crc kubenswrapper[4691]: I0930 07:04:52.357566 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dnzh" event={"ID":"fd9a3794-0dec-4779-928c-cc0742f12e96","Type":"ContainerDied","Data":"3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e"}
Sep 30 07:04:52 crc kubenswrapper[4691]: I0930 07:04:52.360046 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6tps" event={"ID":"a3478733-7984-4f66-82f1-b95b9fcc3e43","Type":"ContainerStarted","Data":"8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1"}
Sep 30 07:04:52 crc kubenswrapper[4691]: I0930 07:04:52.360197 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6tps" event={"ID":"a3478733-7984-4f66-82f1-b95b9fcc3e43","Type":"ContainerStarted","Data":"ac7848345ae69bf78d173882b473f2b31b578a87199bd0daf573d6254ea43d70"}
Sep 30 07:04:53 crc kubenswrapper[4691]: I0930 07:04:53.372197 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dnzh" event={"ID":"fd9a3794-0dec-4779-928c-cc0742f12e96","Type":"ContainerStarted","Data":"ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835"}
Sep 30 07:04:53 crc kubenswrapper[4691]: I0930 07:04:53.375159 4691 generic.go:334] "Generic (PLEG): container finished" podID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerID="8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1" exitCode=0
Sep 30 07:04:53 crc kubenswrapper[4691]: I0930 07:04:53.375315 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6tps" event={"ID":"a3478733-7984-4f66-82f1-b95b9fcc3e43","Type":"ContainerDied","Data":"8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1"}
Sep 30 07:04:53 crc kubenswrapper[4691]: I0930 07:04:53.426671 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2dnzh" podStartSLOduration=1.987141441 podStartE2EDuration="5.426645349s" podCreationTimestamp="2025-09-30 07:04:48 +0000 UTC" firstStartedPulling="2025-09-30 07:04:49.31340957 +0000 UTC m=+2732.788430650" lastFinishedPulling="2025-09-30 07:04:52.752913488 +0000 UTC m=+2736.227934558" observedRunningTime="2025-09-30 07:04:53.397197558 +0000 UTC m=+2736.872218608" watchObservedRunningTime="2025-09-30 07:04:53.426645349 +0000 UTC m=+2736.901666399"
Sep 30 07:04:55 crc kubenswrapper[4691]: I0930 07:04:55.409721 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6tps" event={"ID":"a3478733-7984-4f66-82f1-b95b9fcc3e43","Type":"ContainerStarted","Data":"27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1"}
Sep 30 07:04:58 crc kubenswrapper[4691]: I0930 07:04:58.456119 4691 generic.go:334] "Generic (PLEG): container finished" podID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerID="27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1" exitCode=0
Sep 30 07:04:58 crc kubenswrapper[4691]: I0930 07:04:58.456222 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6tps" event={"ID":"a3478733-7984-4f66-82f1-b95b9fcc3e43","Type":"ContainerDied","Data":"27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1"}
Sep 30 07:04:58 crc kubenswrapper[4691]: I0930 07:04:58.582931 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2dnzh"
Sep 30 07:04:58 crc kubenswrapper[4691]: I0930 07:04:58.583378 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2dnzh"
Sep 30 07:04:58 crc kubenswrapper[4691]: I0930 07:04:58.664023 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2dnzh"
Sep 30 07:04:59 crc kubenswrapper[4691]: I0930 07:04:59.471129 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6tps" event={"ID":"a3478733-7984-4f66-82f1-b95b9fcc3e43","Type":"ContainerStarted","Data":"a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534"}
Sep 30 07:04:59 crc kubenswrapper[4691]: I0930 07:04:59.501705 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f6tps" podStartSLOduration=3.978355628 podStartE2EDuration="9.50163773s" podCreationTimestamp="2025-09-30 07:04:50 +0000 UTC" firstStartedPulling="2025-09-30 07:04:53.379875213 +0000 UTC m=+2736.854896273" lastFinishedPulling="2025-09-30 07:04:58.903157325 +0000 UTC m=+2742.378178375" observedRunningTime="2025-09-30 07:04:59.50133523 +0000 UTC m=+2742.976356350" watchObservedRunningTime="2025-09-30 07:04:59.50163773 +0000 UTC m=+2742.976658820"
Sep 30 07:04:59 crc kubenswrapper[4691]: I0930 07:04:59.533037 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2dnzh"
Sep 30 07:04:59 crc kubenswrapper[4691]: I0930 07:04:59.982463 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2dnzh"]
Sep 30 07:05:01 crc kubenswrapper[4691]: I0930 07:05:01.327448 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:05:01 crc kubenswrapper[4691]: I0930 07:05:01.327786 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:05:01 crc kubenswrapper[4691]: I0930 07:05:01.492811 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2dnzh" podUID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerName="registry-server" containerID="cri-o://ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835" gracePeriod=2
Sep 30 07:05:01 crc kubenswrapper[4691]: E0930 07:05:01.720743 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd9a3794_0dec_4779_928c_cc0742f12e96.slice/crio-conmon-ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd9a3794_0dec_4779_928c_cc0742f12e96.slice/crio-ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835.scope\": RecentStats: unable to find data in memory cache]"
Sep 30 07:05:01 crc kubenswrapper[4691]: I0930 07:05:01.956754 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dnzh"
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.103097 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-catalog-content\") pod \"fd9a3794-0dec-4779-928c-cc0742f12e96\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") "
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.103457 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngs69\" (UniqueName: \"kubernetes.io/projected/fd9a3794-0dec-4779-928c-cc0742f12e96-kube-api-access-ngs69\") pod \"fd9a3794-0dec-4779-928c-cc0742f12e96\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") "
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.103534 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-utilities\") pod \"fd9a3794-0dec-4779-928c-cc0742f12e96\" (UID: \"fd9a3794-0dec-4779-928c-cc0742f12e96\") "
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.104934 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-utilities" (OuterVolumeSpecName: "utilities") pod "fd9a3794-0dec-4779-928c-cc0742f12e96" (UID: "fd9a3794-0dec-4779-928c-cc0742f12e96"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.111487 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd9a3794-0dec-4779-928c-cc0742f12e96-kube-api-access-ngs69" (OuterVolumeSpecName: "kube-api-access-ngs69") pod "fd9a3794-0dec-4779-928c-cc0742f12e96" (UID: "fd9a3794-0dec-4779-928c-cc0742f12e96"). InnerVolumeSpecName "kube-api-access-ngs69". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.180069 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd9a3794-0dec-4779-928c-cc0742f12e96" (UID: "fd9a3794-0dec-4779-928c-cc0742f12e96"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.205739 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.205973 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngs69\" (UniqueName: \"kubernetes.io/projected/fd9a3794-0dec-4779-928c-cc0742f12e96-kube-api-access-ngs69\") on node \"crc\" DevicePath \"\""
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.206061 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd9a3794-0dec-4779-928c-cc0742f12e96-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.390475 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f6tps" podUID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerName="registry-server" probeResult="failure" output=<
Sep 30 07:05:02 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s
Sep 30 07:05:02 crc kubenswrapper[4691]: >
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.520437 4691 generic.go:334] "Generic (PLEG): container finished" podID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerID="ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835" exitCode=0
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.520507 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dnzh" event={"ID":"fd9a3794-0dec-4779-928c-cc0742f12e96","Type":"ContainerDied","Data":"ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835"}
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.520581 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2dnzh" event={"ID":"fd9a3794-0dec-4779-928c-cc0742f12e96","Type":"ContainerDied","Data":"c83fd5bc1f52cd563c111b9a2c9b07ba62c1eb9262ff5ca538ceb185ac859dfe"}
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.520607 4691 scope.go:117] "RemoveContainer" containerID="ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835"
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.521482 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2dnzh"
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.552587 4691 scope.go:117] "RemoveContainer" containerID="3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e"
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.578971 4691 scope.go:117] "RemoveContainer" containerID="f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521"
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.590616 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2dnzh"]
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.608158 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2dnzh"]
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.628760 4691 scope.go:117] "RemoveContainer" containerID="ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835"
Sep 30 07:05:02 crc kubenswrapper[4691]: E0930 07:05:02.629344 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835\": container with ID starting with ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835 not found: ID does not exist" containerID="ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835"
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.629400 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835"} err="failed to get container status \"ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835\": rpc error: code = NotFound desc = could not find container \"ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835\": container with ID starting with ab58c7481f0923ce9c79ec9dee73895331a6f8447b09c024b11df07262ed6835 not found: ID does not exist"
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.629434 4691 scope.go:117] "RemoveContainer" containerID="3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e"
Sep 30 07:05:02 crc kubenswrapper[4691]: E0930 07:05:02.630233 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e\": container with ID starting with 3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e not found: ID does not exist" containerID="3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e"
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.630302 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e"} err="failed to get container status \"3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e\": rpc error: code = NotFound desc = could not find container \"3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e\": container with ID starting with 3595045c826f8c6811fcd8e5f6408706af79af62504e505c00321ffd5df1b78e not found: ID does not exist"
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.630339 4691 scope.go:117] "RemoveContainer" containerID="f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521"
Sep 30 07:05:02 crc kubenswrapper[4691]: E0930 07:05:02.630734 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521\": container with ID starting with f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521 not found: ID does not exist" containerID="f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521"
Sep 30 07:05:02 crc kubenswrapper[4691]: I0930 07:05:02.630926 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521"} err="failed to get container status \"f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521\": rpc error: code = NotFound desc = could not find container \"f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521\": container with ID starting with f2e059637a7f45d801853f1c540b881cd04c918d0a46178da306f13b5d0a2521 not found: ID does not exist"
Sep 30 07:05:03 crc kubenswrapper[4691]: I0930 07:05:03.247377 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd9a3794-0dec-4779-928c-cc0742f12e96" path="/var/lib/kubelet/pods/fd9a3794-0dec-4779-928c-cc0742f12e96/volumes"
Sep 30 07:05:11 crc kubenswrapper[4691]: I0930 07:05:11.412263 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:05:11 crc kubenswrapper[4691]: I0930 07:05:11.509793 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:05:11 crc kubenswrapper[4691]: I0930 07:05:11.677984 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f6tps"]
Sep 30 07:05:12 crc kubenswrapper[4691]: I0930 07:05:12.653143 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f6tps" podUID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerName="registry-server" containerID="cri-o://a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534" gracePeriod=2
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.364974 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.460615 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-utilities\") pod \"a3478733-7984-4f66-82f1-b95b9fcc3e43\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") "
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.460705 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prqvk\" (UniqueName: \"kubernetes.io/projected/a3478733-7984-4f66-82f1-b95b9fcc3e43-kube-api-access-prqvk\") pod \"a3478733-7984-4f66-82f1-b95b9fcc3e43\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") "
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.460839 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-catalog-content\") pod \"a3478733-7984-4f66-82f1-b95b9fcc3e43\" (UID: \"a3478733-7984-4f66-82f1-b95b9fcc3e43\") "
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.462554 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-utilities" (OuterVolumeSpecName: "utilities") pod "a3478733-7984-4f66-82f1-b95b9fcc3e43" (UID: "a3478733-7984-4f66-82f1-b95b9fcc3e43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.475504 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3478733-7984-4f66-82f1-b95b9fcc3e43-kube-api-access-prqvk" (OuterVolumeSpecName: "kube-api-access-prqvk") pod "a3478733-7984-4f66-82f1-b95b9fcc3e43" (UID: "a3478733-7984-4f66-82f1-b95b9fcc3e43"). InnerVolumeSpecName "kube-api-access-prqvk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.540790 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a3478733-7984-4f66-82f1-b95b9fcc3e43" (UID: "a3478733-7984-4f66-82f1-b95b9fcc3e43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.563478 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.563515 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a3478733-7984-4f66-82f1-b95b9fcc3e43-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.563528 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prqvk\" (UniqueName: \"kubernetes.io/projected/a3478733-7984-4f66-82f1-b95b9fcc3e43-kube-api-access-prqvk\") on node \"crc\" DevicePath \"\""
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.667500 4691 generic.go:334] "Generic (PLEG): container finished" podID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerID="a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534" exitCode=0
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.667584 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f6tps"
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.667605 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6tps" event={"ID":"a3478733-7984-4f66-82f1-b95b9fcc3e43","Type":"ContainerDied","Data":"a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534"}
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.668035 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6tps" event={"ID":"a3478733-7984-4f66-82f1-b95b9fcc3e43","Type":"ContainerDied","Data":"ac7848345ae69bf78d173882b473f2b31b578a87199bd0daf573d6254ea43d70"}
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.668059 4691 scope.go:117] "RemoveContainer" containerID="a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534"
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.694839 4691 scope.go:117] "RemoveContainer" containerID="27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1"
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.729836 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f6tps"]
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.736858 4691 scope.go:117] "RemoveContainer" containerID="8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1"
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.742822 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f6tps"]
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.771289 4691 scope.go:117] "RemoveContainer" containerID="a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534"
Sep 30 07:05:13 crc kubenswrapper[4691]: E0930 07:05:13.771811 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534\": container with ID starting with a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534 not found: ID does not exist" containerID="a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534"
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.771864 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534"} err="failed to get container status \"a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534\": rpc error: code = NotFound desc = could not find container \"a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534\": container with ID starting with a4d90bdb9ea6aca7059b70cda5746e4822ed7cbcffab7cec9d7078f3ed431534 not found: ID does not exist"
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.771919 4691 scope.go:117] "RemoveContainer" containerID="27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1"
Sep 30 07:05:13 crc kubenswrapper[4691]: E0930 07:05:13.772294 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1\": container with ID starting with 27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1 not found: ID does not exist" containerID="27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1"
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.772356 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1"} err="failed to get container status \"27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1\": rpc error: code = NotFound desc = could not find container \"27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1\": container with ID starting with 27bdc4264c997c8b7accbda2edba37ae5cc69ca2f754975c4cea16ec57b5c2d1 not found: ID does not exist"
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.772403 4691 scope.go:117] "RemoveContainer" containerID="8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1"
Sep 30 07:05:13 crc kubenswrapper[4691]: E0930 07:05:13.772831 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1\": container with ID starting with 8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1 not found: ID does not exist" containerID="8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1"
Sep 30 07:05:13 crc kubenswrapper[4691]: I0930 07:05:13.772869 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1"} err="failed to get container status \"8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1\": rpc error: code = NotFound desc = could not find container \"8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1\": container with ID starting with 8ab43a5170f1fae43993c7e55f7c1c3b58565425d38877598e74f81f60b2c7e1 not found: ID does not exist"
Sep 30 07:05:15 crc kubenswrapper[4691]: I0930 07:05:15.247142 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a3478733-7984-4f66-82f1-b95b9fcc3e43" path="/var/lib/kubelet/pods/a3478733-7984-4f66-82f1-b95b9fcc3e43/volumes"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.748438 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-4hhmm"]
Sep 30 07:05:36 crc kubenswrapper[4691]: E0930 07:05:36.750109 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerName="extract-content"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.750176 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerName="extract-content"
Sep 30 07:05:36 crc kubenswrapper[4691]: E0930 07:05:36.750197 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerName="extract-utilities"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.750211 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerName="extract-utilities"
Sep 30 07:05:36 crc kubenswrapper[4691]: E0930 07:05:36.750243 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerName="registry-server"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.750256 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerName="registry-server"
Sep 30 07:05:36 crc kubenswrapper[4691]: E0930 07:05:36.750289 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerName="registry-server"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.750301 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerName="registry-server"
Sep 30 07:05:36 crc kubenswrapper[4691]: E0930 07:05:36.750334 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerName="extract-content"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.750347 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerName="extract-content"
Sep 30 07:05:36 crc kubenswrapper[4691]: E0930 07:05:36.750368 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerName="extract-utilities"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.750380 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerName="extract-utilities"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.750787 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd9a3794-0dec-4779-928c-cc0742f12e96" containerName="registry-server"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.750843 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3478733-7984-4f66-82f1-b95b9fcc3e43" containerName="registry-server"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.753490 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.758987 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-catalog-content\") pod \"certified-operators-4hhmm\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") " pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.759191 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-utilities\") pod \"certified-operators-4hhmm\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") " pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.759362 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-452sf\" (UniqueName: \"kubernetes.io/projected/28e27d62-b9ad-4954-8c1b-55c1720a767d-kube-api-access-452sf\") pod \"certified-operators-4hhmm\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") " pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.776639 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hhmm"]
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.860832 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-452sf\" (UniqueName: \"kubernetes.io/projected/28e27d62-b9ad-4954-8c1b-55c1720a767d-kube-api-access-452sf\") pod \"certified-operators-4hhmm\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") " pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.860907 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-catalog-content\") pod \"certified-operators-4hhmm\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") " pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.860986 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-utilities\") pod \"certified-operators-4hhmm\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") " pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.861452 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-utilities\") pod \"certified-operators-4hhmm\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") " pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.861997 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-catalog-content\") pod \"certified-operators-4hhmm\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") " pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:36 crc kubenswrapper[4691]: I0930 07:05:36.888747 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-452sf\" (UniqueName: \"kubernetes.io/projected/28e27d62-b9ad-4954-8c1b-55c1720a767d-kube-api-access-452sf\") pod \"certified-operators-4hhmm\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") " pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:37 crc kubenswrapper[4691]: I0930 07:05:37.117839 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:37 crc kubenswrapper[4691]: I0930 07:05:37.608047 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-4hhmm"]
Sep 30 07:05:37 crc kubenswrapper[4691]: I0930 07:05:37.941974 4691 generic.go:334] "Generic (PLEG): container finished" podID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerID="96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2" exitCode=0
Sep 30 07:05:37 crc kubenswrapper[4691]: I0930 07:05:37.942011 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhmm" event={"ID":"28e27d62-b9ad-4954-8c1b-55c1720a767d","Type":"ContainerDied","Data":"96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2"}
Sep 30 07:05:37 crc kubenswrapper[4691]: I0930 07:05:37.942036 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhmm" event={"ID":"28e27d62-b9ad-4954-8c1b-55c1720a767d","Type":"ContainerStarted","Data":"616c20e33cae215499aa50a29d8b0565ae242d57dffca7f33b084cd7da8a09e1"}
Sep 30 07:05:38 crc kubenswrapper[4691]: I0930 07:05:38.953610 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhmm" event={"ID":"28e27d62-b9ad-4954-8c1b-55c1720a767d","Type":"ContainerStarted","Data":"3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3"}
Sep 30 07:05:40 crc kubenswrapper[4691]: I0930 07:05:40.976628 4691 generic.go:334] "Generic (PLEG): container finished" podID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerID="3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3" exitCode=0
Sep 30 07:05:40 crc kubenswrapper[4691]: I0930 07:05:40.976669 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhmm" event={"ID":"28e27d62-b9ad-4954-8c1b-55c1720a767d","Type":"ContainerDied","Data":"3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3"}
Sep 30 07:05:41 crc kubenswrapper[4691]: I0930 07:05:41.991152 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhmm" event={"ID":"28e27d62-b9ad-4954-8c1b-55c1720a767d","Type":"ContainerStarted","Data":"5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934"}
Sep 30 07:05:42 crc kubenswrapper[4691]: I0930 07:05:42.020412 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-4hhmm" podStartSLOduration=2.510135953 podStartE2EDuration="6.020393764s" podCreationTimestamp="2025-09-30 07:05:36 +0000 UTC" firstStartedPulling="2025-09-30 07:05:37.943769405 +0000 UTC m=+2781.418790445" lastFinishedPulling="2025-09-30 07:05:41.454027206 +0000 UTC m=+2784.929048256" observedRunningTime="2025-09-30 07:05:42.014802466 +0000 UTC m=+2785.489823516" watchObservedRunningTime="2025-09-30 07:05:42.020393764 +0000 UTC m=+2785.495414804"
Sep 30 07:05:47 crc kubenswrapper[4691]: I0930 07:05:47.118562 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:47 crc kubenswrapper[4691]: I0930 07:05:47.119714 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:47 crc kubenswrapper[4691]: I0930 07:05:47.163248 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:48 crc kubenswrapper[4691]: I0930 07:05:48.109545 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:48 crc kubenswrapper[4691]: I0930 07:05:48.194800 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hhmm"]
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.069610 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-4hhmm" podUID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerName="registry-server" containerID="cri-o://5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934" gracePeriod=2
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.559686 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.680965 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-452sf\" (UniqueName: \"kubernetes.io/projected/28e27d62-b9ad-4954-8c1b-55c1720a767d-kube-api-access-452sf\") pod \"28e27d62-b9ad-4954-8c1b-55c1720a767d\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") "
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.681055 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-utilities\") pod \"28e27d62-b9ad-4954-8c1b-55c1720a767d\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") "
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.681370 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-catalog-content\") pod \"28e27d62-b9ad-4954-8c1b-55c1720a767d\" (UID: \"28e27d62-b9ad-4954-8c1b-55c1720a767d\") "
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.682344 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-utilities" (OuterVolumeSpecName: "utilities") pod "28e27d62-b9ad-4954-8c1b-55c1720a767d" (UID: "28e27d62-b9ad-4954-8c1b-55c1720a767d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.693924 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28e27d62-b9ad-4954-8c1b-55c1720a767d-kube-api-access-452sf" (OuterVolumeSpecName: "kube-api-access-452sf") pod "28e27d62-b9ad-4954-8c1b-55c1720a767d" (UID: "28e27d62-b9ad-4954-8c1b-55c1720a767d"). InnerVolumeSpecName "kube-api-access-452sf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.764898 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28e27d62-b9ad-4954-8c1b-55c1720a767d" (UID: "28e27d62-b9ad-4954-8c1b-55c1720a767d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.784326 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.784362 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-452sf\" (UniqueName: \"kubernetes.io/projected/28e27d62-b9ad-4954-8c1b-55c1720a767d-kube-api-access-452sf\") on node \"crc\" DevicePath \"\""
Sep 30 07:05:50 crc kubenswrapper[4691]: I0930 07:05:50.784390 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28e27d62-b9ad-4954-8c1b-55c1720a767d-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.084661 4691 generic.go:334] "Generic (PLEG): container finished" podID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerID="5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934" exitCode=0
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.084701 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhmm" event={"ID":"28e27d62-b9ad-4954-8c1b-55c1720a767d","Type":"ContainerDied","Data":"5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934"}
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.084728 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-4hhmm" event={"ID":"28e27d62-b9ad-4954-8c1b-55c1720a767d","Type":"ContainerDied","Data":"616c20e33cae215499aa50a29d8b0565ae242d57dffca7f33b084cd7da8a09e1"}
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.084735 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-4hhmm"
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.084745 4691 scope.go:117] "RemoveContainer" containerID="5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934"
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.117054 4691 scope.go:117] "RemoveContainer" containerID="3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3"
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.135873 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-4hhmm"]
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.145087 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-4hhmm"]
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.167126 4691 scope.go:117] "RemoveContainer" containerID="96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2"
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.243424 4691 scope.go:117] "RemoveContainer" containerID="5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934"
Sep 30 07:05:51 crc kubenswrapper[4691]: E0930 07:05:51.246592 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934\": container with ID starting with 5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934 not found: ID does not exist" containerID="5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934"
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.246631 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934"} err="failed to get container status \"5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934\": rpc error: code = NotFound desc = could not find container \"5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934\": container with ID starting with 5b133f4b3b21c94a752450bfc463f2367bb971c73112dc7e1b3b040f937c7934 not found: ID does not exist"
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.246654 4691 scope.go:117] "RemoveContainer" containerID="3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3"
Sep 30 07:05:51 crc kubenswrapper[4691]: E0930 07:05:51.246990 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3\": container with ID starting with 3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3 not found: ID does not exist" containerID="3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3"
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.247030 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3"} err="failed to get container status \"3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3\": rpc error: code = NotFound desc = could not find container \"3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3\": container with ID starting with 3f73742cc80284f0483e4f1b88717b48f0ff3ff5aed4fac8c4ea74ce66bb03a3 not found: ID does not exist"
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.247099 4691 scope.go:117] "RemoveContainer" containerID="96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2"
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.247306 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28e27d62-b9ad-4954-8c1b-55c1720a767d" path="/var/lib/kubelet/pods/28e27d62-b9ad-4954-8c1b-55c1720a767d/volumes"
Sep 30 07:05:51 crc kubenswrapper[4691]: E0930 07:05:51.248417 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2\": container with ID starting with 96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2 not found: ID does not exist" containerID="96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2"
Sep 30 07:05:51 crc kubenswrapper[4691]: I0930 07:05:51.248449 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2"} err="failed to get container status \"96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2\": rpc error: code = NotFound desc = could not find container \"96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2\": container with ID starting with 96c6f271c550a555396ac6cef19402d21dc42cb5d9f57276ca77e7b62a328ba2 not found: ID does not exist"
Sep 30 07:05:52 crc kubenswrapper[4691]: I0930 07:05:52.817483 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5ch9t"]
Sep 30 07:05:52 crc kubenswrapper[4691]: E0930 07:05:52.818549 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerName="registry-server"
Sep 30 07:05:52 crc kubenswrapper[4691]: I0930 07:05:52.818571 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerName="registry-server"
Sep 30 07:05:52 crc kubenswrapper[4691]: E0930 07:05:52.818923 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerName="extract-content"
Sep 30 07:05:52 crc kubenswrapper[4691]: I0930 07:05:52.818941 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerName="extract-content"
Sep 30 07:05:52 crc kubenswrapper[4691]: E0930 07:05:52.818964 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerName="extract-utilities"
Sep 30 07:05:52 crc kubenswrapper[4691]: I0930 07:05:52.818977 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerName="extract-utilities"
Sep 30 07:05:52 crc kubenswrapper[4691]: I0930 07:05:52.819306 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="28e27d62-b9ad-4954-8c1b-55c1720a767d" containerName="registry-server"
Sep 30 07:05:52 crc kubenswrapper[4691]: I0930 07:05:52.821597 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:52 crc kubenswrapper[4691]: I0930 07:05:52.846136 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ch9t"]
Sep 30 07:05:52 crc kubenswrapper[4691]: I0930 07:05:52.928965 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-catalog-content\") pod \"redhat-marketplace-5ch9t\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:52 crc kubenswrapper[4691]: I0930 07:05:52.929032 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vmq8\" (UniqueName: \"kubernetes.io/projected/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-kube-api-access-6vmq8\") pod \"redhat-marketplace-5ch9t\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:52 crc kubenswrapper[4691]: I0930 07:05:52.929205 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-utilities\") pod \"redhat-marketplace-5ch9t\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:53 crc kubenswrapper[4691]: I0930 07:05:53.031111 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-utilities\") pod \"redhat-marketplace-5ch9t\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:53 crc kubenswrapper[4691]: I0930 07:05:53.031244 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-catalog-content\") pod \"redhat-marketplace-5ch9t\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:53 crc kubenswrapper[4691]: I0930 07:05:53.031288 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vmq8\" (UniqueName: \"kubernetes.io/projected/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-kube-api-access-6vmq8\") pod \"redhat-marketplace-5ch9t\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:53 crc kubenswrapper[4691]: I0930 07:05:53.031693 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-catalog-content\") pod \"redhat-marketplace-5ch9t\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:53 crc kubenswrapper[4691]: I0930 07:05:53.032016 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-utilities\") pod \"redhat-marketplace-5ch9t\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:53 crc kubenswrapper[4691]: I0930 07:05:53.068241 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vmq8\" (UniqueName: \"kubernetes.io/projected/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-kube-api-access-6vmq8\") pod \"redhat-marketplace-5ch9t\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:53 crc kubenswrapper[4691]: I0930 07:05:53.148736 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:05:53 crc kubenswrapper[4691]: I0930 07:05:53.718689 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ch9t"]
Sep 30 07:05:54 crc kubenswrapper[4691]: I0930 07:05:54.136417 4691 generic.go:334] "Generic (PLEG): container finished" podID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerID="838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1" exitCode=0
Sep 30 07:05:54 crc kubenswrapper[4691]: I0930 07:05:54.136473 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ch9t" event={"ID":"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87","Type":"ContainerDied","Data":"838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1"}
Sep 30 07:05:54 crc kubenswrapper[4691]: I0930 07:05:54.136505 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ch9t" event={"ID":"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87","Type":"ContainerStarted","Data":"27b2f3a48c300570f48a3e5d0bbd7d3f2e7928e967c064aab0de7d9800cdbcd3"}
Sep 30 07:05:55 crc kubenswrapper[4691]: I0930 07:05:55.150358 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ch9t" event={"ID":"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87","Type":"ContainerStarted","Data":"6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607"}
Sep 30 07:05:56 crc kubenswrapper[4691]: I0930 07:05:56.159965 4691 generic.go:334] "Generic (PLEG): container finished" podID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerID="6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607" exitCode=0
Sep 30 07:05:56 crc kubenswrapper[4691]: I0930 07:05:56.160262 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ch9t" event={"ID":"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87","Type":"ContainerDied","Data":"6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607"}
Sep 30 07:05:57 crc kubenswrapper[4691]: I0930 07:05:57.172150 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ch9t" event={"ID":"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87","Type":"ContainerStarted","Data":"963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077"}
Sep 30 07:05:57 crc kubenswrapper[4691]: I0930 07:05:57.195118 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5ch9t" podStartSLOduration=2.648701869 podStartE2EDuration="5.195097034s" podCreationTimestamp="2025-09-30 07:05:52 +0000 UTC" firstStartedPulling="2025-09-30 07:05:54.138833789 +0000 UTC m=+2797.613854869" lastFinishedPulling="2025-09-30 07:05:56.685228984 +0000 UTC m=+2800.160250034" observedRunningTime="2025-09-30 07:05:57.189692342 +0000 UTC m=+2800.664713412" watchObservedRunningTime="2025-09-30 07:05:57.195097034 +0000 UTC m=+2800.670118084"
Sep 30 07:06:03 crc kubenswrapper[4691]: I0930 07:06:03.150512 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:06:03 crc kubenswrapper[4691]: I0930 07:06:03.151494 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:06:03 crc kubenswrapper[4691]: I0930 07:06:03.244648 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:06:03 crc kubenswrapper[4691]: I0930 07:06:03.253392 4691 generic.go:334] "Generic (PLEG): container finished" podID="8c1bc2df-cff0-4d61-9773-0db30010956c" containerID="87de3534c90c4ddc69a46cfab9754ab4399d9d4e4e0be0c502d885477309e7d9" exitCode=0
Sep 30 07:06:03 crc kubenswrapper[4691]: I0930 07:06:03.253512 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" event={"ID":"8c1bc2df-cff0-4d61-9773-0db30010956c","Type":"ContainerDied","Data":"87de3534c90c4ddc69a46cfab9754ab4399d9d4e4e0be0c502d885477309e7d9"}
Sep 30 07:06:03 crc kubenswrapper[4691]: I0930 07:06:03.337281 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5ch9t"
Sep 30 07:06:03 crc kubenswrapper[4691]: I0930 07:06:03.484668 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ch9t"]
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.695591 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9"
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.805208 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-2\") pod \"8c1bc2df-cff0-4d61-9773-0db30010956c\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") "
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.805255 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-1\") pod \"8c1bc2df-cff0-4d61-9773-0db30010956c\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") "
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.805305 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-0\") pod \"8c1bc2df-cff0-4d61-9773-0db30010956c\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") "
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.805339 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzbz8\" (UniqueName: \"kubernetes.io/projected/8c1bc2df-cff0-4d61-9773-0db30010956c-kube-api-access-wzbz8\") pod \"8c1bc2df-cff0-4d61-9773-0db30010956c\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") "
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.805364 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-inventory\") pod \"8c1bc2df-cff0-4d61-9773-0db30010956c\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") "
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.805449 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ssh-key\") pod \"8c1bc2df-cff0-4d61-9773-0db30010956c\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") "
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.805479 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-telemetry-combined-ca-bundle\") pod \"8c1bc2df-cff0-4d61-9773-0db30010956c\" (UID: \"8c1bc2df-cff0-4d61-9773-0db30010956c\") "
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.823138 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8c1bc2df-cff0-4d61-9773-0db30010956c" (UID: "8c1bc2df-cff0-4d61-9773-0db30010956c"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.833125 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c1bc2df-cff0-4d61-9773-0db30010956c-kube-api-access-wzbz8" (OuterVolumeSpecName: "kube-api-access-wzbz8") pod "8c1bc2df-cff0-4d61-9773-0db30010956c" (UID: "8c1bc2df-cff0-4d61-9773-0db30010956c"). InnerVolumeSpecName "kube-api-access-wzbz8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.850969 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8c1bc2df-cff0-4d61-9773-0db30010956c" (UID: "8c1bc2df-cff0-4d61-9773-0db30010956c"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.894030 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "8c1bc2df-cff0-4d61-9773-0db30010956c" (UID: "8c1bc2df-cff0-4d61-9773-0db30010956c"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.910345 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "8c1bc2df-cff0-4d61-9773-0db30010956c" (UID: "8c1bc2df-cff0-4d61-9773-0db30010956c"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.910376 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-inventory" (OuterVolumeSpecName: "inventory") pod "8c1bc2df-cff0-4d61-9773-0db30010956c" (UID: "8c1bc2df-cff0-4d61-9773-0db30010956c"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.911669 4691 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.911695 4691 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.911705 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzbz8\" (UniqueName: \"kubernetes.io/projected/8c1bc2df-cff0-4d61-9773-0db30010956c-kube-api-access-wzbz8\") on node \"crc\" DevicePath \"\""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.911717 4691 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-inventory\") on node \"crc\" DevicePath \"\""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.911726 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.911734 4691 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Sep 30 07:06:04 crc kubenswrapper[4691]: I0930 07:06:04.912011 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "8c1bc2df-cff0-4d61-9773-0db30010956c" (UID: "8c1bc2df-cff0-4d61-9773-0db30010956c"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.013843 4691 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/8c1bc2df-cff0-4d61-9773-0db30010956c-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\""
Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.283545 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" event={"ID":"8c1bc2df-cff0-4d61-9773-0db30010956c","Type":"ContainerDied","Data":"6a6f854609800e77a101be0cb86559d7bc8fdf157a59da96f3dd54c2f05a2bbb"}
Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.283902 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a6f854609800e77a101be0cb86559d7bc8fdf157a59da96f3dd54c2f05a2bbb"
Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.283571 4691 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-cthz9" Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.283673 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5ch9t" podUID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerName="registry-server" containerID="cri-o://963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077" gracePeriod=2 Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.816720 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ch9t" Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.930909 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-catalog-content\") pod \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.931296 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vmq8\" (UniqueName: \"kubernetes.io/projected/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-kube-api-access-6vmq8\") pod \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.931360 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-utilities\") pod \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\" (UID: \"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87\") " Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.932204 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-utilities" (OuterVolumeSpecName: "utilities") pod "6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" (UID: "6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.937833 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-kube-api-access-6vmq8" (OuterVolumeSpecName: "kube-api-access-6vmq8") pod "6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" (UID: "6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87"). InnerVolumeSpecName "kube-api-access-6vmq8". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:06:05 crc kubenswrapper[4691]: I0930 07:06:05.961602 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" (UID: "6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.034502 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vmq8\" (UniqueName: \"kubernetes.io/projected/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-kube-api-access-6vmq8\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.034540 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.034554 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.295985 4691 generic.go:334] "Generic (PLEG): container finished" podID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerID="963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077" exitCode=0 Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.296032 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ch9t" event={"ID":"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87","Type":"ContainerDied","Data":"963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077"} Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.296060 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5ch9t" event={"ID":"6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87","Type":"ContainerDied","Data":"27b2f3a48c300570f48a3e5d0bbd7d3f2e7928e967c064aab0de7d9800cdbcd3"} Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.296059 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5ch9t" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.296076 4691 scope.go:117] "RemoveContainer" containerID="963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.333628 4691 scope.go:117] "RemoveContainer" containerID="6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.338629 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ch9t"] Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.348540 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5ch9t"] Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.363111 4691 scope.go:117] "RemoveContainer" containerID="838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.435686 4691 scope.go:117] "RemoveContainer" containerID="963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077" Sep 30 07:06:06 crc kubenswrapper[4691]: E0930 07:06:06.436261 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077\": container with ID starting with 963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077 not found: ID does not exist" containerID="963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.436335 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077"} err="failed to get container status \"963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077\": rpc error: code = NotFound desc = could not find container \"963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077\": container with ID starting with 963b0f00bc0499d0337402b7ed92965933380fbac6dfc271405b7df19a35d077 not found: ID does not exist" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.436378 4691 scope.go:117] "RemoveContainer" containerID="6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607" Sep 30 07:06:06 crc kubenswrapper[4691]: E0930 07:06:06.436921 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607\": container with ID starting with 6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607 not found: ID does not exist" containerID="6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.436955 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607"} err="failed to get container status \"6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607\": rpc error: code = NotFound desc = could not find container \"6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607\": container with ID starting with 6e7812a4d9a7e1684cc7df282a474573df4365fa547e684009d31aaf89e72607 not found: ID does not exist" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.436980 4691 scope.go:117] "RemoveContainer" 
containerID="838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1" Sep 30 07:06:06 crc kubenswrapper[4691]: E0930 07:06:06.437198 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1\": container with ID starting with 838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1 not found: ID does not exist" containerID="838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1" Sep 30 07:06:06 crc kubenswrapper[4691]: I0930 07:06:06.437218 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1"} err="failed to get container status \"838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1\": rpc error: code = NotFound desc = could not find container \"838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1\": container with ID starting with 838012aae278de515e2f7995ae98aecd2533f948f35976c36d05196d05aeb6e1 not found: ID does not exist" Sep 30 07:06:07 crc kubenswrapper[4691]: I0930 07:06:07.241611 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" path="/var/lib/kubelet/pods/6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87/volumes" Sep 30 07:06:40 crc kubenswrapper[4691]: I0930 07:06:40.884743 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:06:40 crc kubenswrapper[4691]: I0930 07:06:40.886010 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="prometheus" containerID="cri-o://f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b" gracePeriod=600 Sep 30 07:06:40 crc kubenswrapper[4691]: I0930 07:06:40.886150 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="config-reloader" containerID="cri-o://8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5" gracePeriod=600 Sep 30 07:06:40 crc kubenswrapper[4691]: I0930 07:06:40.886148 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="thanos-sidecar" containerID="cri-o://ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204" gracePeriod=600 Sep 30 07:06:41 crc kubenswrapper[4691]: I0930 07:06:41.753193 4691 generic.go:334] "Generic (PLEG): container finished" podID="e6d36519-9195-4e0b-9760-844d420e2661" containerID="ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204" exitCode=0 Sep 30 07:06:41 crc kubenswrapper[4691]: I0930 07:06:41.753536 4691 generic.go:334] "Generic (PLEG): container finished" podID="e6d36519-9195-4e0b-9760-844d420e2661" containerID="f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b" exitCode=0 Sep 30 07:06:41 crc kubenswrapper[4691]: I0930 07:06:41.753255 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6d36519-9195-4e0b-9760-844d420e2661","Type":"ContainerDied","Data":"ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204"} Sep 30 07:06:41 crc kubenswrapper[4691]: I0930 07:06:41.753573 4691 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6d36519-9195-4e0b-9760-844d420e2661","Type":"ContainerDied","Data":"f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b"} Sep 30 07:06:41 crc kubenswrapper[4691]: I0930 07:06:41.804588 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.133:9090/-/ready\": dial tcp 10.217.0.133:9090: connect: connection refused" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.497749 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.616740 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6d36519-9195-4e0b-9760-844d420e2661-config-out\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.616823 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.616855 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wgfvq\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-kube-api-access-wgfvq\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.616919 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.617104 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.617130 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e6d36519-9195-4e0b-9760-844d420e2661-prometheus-metric-storage-rulefiles-0\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.617157 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: 
I0930 07:06:42.617772 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6d36519-9195-4e0b-9760-844d420e2661-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.617817 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-tls-assets\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.617857 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-config\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.617992 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-secret-combined-ca-bundle\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.618064 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-thanos-prometheus-http-client-file\") pod \"e6d36519-9195-4e0b-9760-844d420e2661\" (UID: \"e6d36519-9195-4e0b-9760-844d420e2661\") " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.618981 4691 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e6d36519-9195-4e0b-9760-844d420e2661-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.628834 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.628981 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-secret-combined-ca-bundle" (OuterVolumeSpecName: "secret-combined-ca-bundle") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "secret-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.629494 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.629678 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-config" (OuterVolumeSpecName: "config") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.629927 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-kube-api-access-wgfvq" (OuterVolumeSpecName: "kube-api-access-wgfvq") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "kube-api-access-wgfvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.637166 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6d36519-9195-4e0b-9760-844d420e2661-config-out" (OuterVolumeSpecName: "config-out") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.637209 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.637323 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d" (OuterVolumeSpecName: "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.682958 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.720897 4691 reconciler_common.go:293] "Volume detached for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-secret-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.721083 4691 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.721100 4691 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e6d36519-9195-4e0b-9760-844d420e2661-config-out\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.721112 4691 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.721124 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wgfvq\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-kube-api-access-wgfvq\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.721156 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") on node \"crc\" " Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.721168 4691 reconciler_common.go:293] "Volume detached for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.721177 4691 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e6d36519-9195-4e0b-9760-844d420e2661-tls-assets\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.721186 4691 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.727573 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config" (OuterVolumeSpecName: "web-config") pod "e6d36519-9195-4e0b-9760-844d420e2661" (UID: "e6d36519-9195-4e0b-9760-844d420e2661"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.751477 4691 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.752711 4691 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5") on node "crc" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.770271 4691 generic.go:334] "Generic (PLEG): container finished" podID="e6d36519-9195-4e0b-9760-844d420e2661" containerID="8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5" exitCode=0 Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.770343 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.770349 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6d36519-9195-4e0b-9760-844d420e2661","Type":"ContainerDied","Data":"8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5"} Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.770439 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"e6d36519-9195-4e0b-9760-844d420e2661","Type":"ContainerDied","Data":"996dc983d58157a9561dd45d248b41b9d8d77e918cf36938e0e399b6d006c25a"} Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.770459 4691 scope.go:117] "RemoveContainer" containerID="ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.823852 4691 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e6d36519-9195-4e0b-9760-844d420e2661-web-config\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.823910 4691 reconciler_common.go:293] "Volume detached for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") on node \"crc\" DevicePath \"\"" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.853092 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.863137 4691 scope.go:117] "RemoveContainer" containerID="8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.865072 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.890980 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:06:42 crc kubenswrapper[4691]: E0930 07:06:42.892409 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="thanos-sidecar" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.892437 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="thanos-sidecar" Sep 30 07:06:42 crc kubenswrapper[4691]: E0930 07:06:42.892458 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c1bc2df-cff0-4d61-9773-0db30010956c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.892466 4691 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8c1bc2df-cff0-4d61-9773-0db30010956c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 07:06:42 crc kubenswrapper[4691]: E0930 07:06:42.892503 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerName="extract-utilities" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.892509 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerName="extract-utilities" Sep 30 07:06:42 crc kubenswrapper[4691]: E0930 07:06:42.892520 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerName="extract-content" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.892525 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerName="extract-content" Sep 30 07:06:42 crc kubenswrapper[4691]: E0930 07:06:42.892538 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="config-reloader" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.892544 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="config-reloader" Sep 30 07:06:42 crc kubenswrapper[4691]: E0930 07:06:42.892578 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerName="registry-server" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.892585 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerName="registry-server" Sep 30 07:06:42 crc kubenswrapper[4691]: E0930 07:06:42.892603 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="prometheus" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.892609 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="prometheus" Sep 30 07:06:42 crc kubenswrapper[4691]: E0930 07:06:42.892621 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="init-config-reloader" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.892648 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="init-config-reloader" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.892984 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c1bc2df-cff0-4d61-9773-0db30010956c" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.893009 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="prometheus" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.893020 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="thanos-sidecar" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.893071 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ff9ce1f-1b1b-41f2-b95d-711ce61f4b87" containerName="registry-server" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.893084 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6d36519-9195-4e0b-9760-844d420e2661" containerName="config-reloader" Sep 30 07:06:42 crc 
kubenswrapper[4691]: I0930 07:06:42.898578 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.900662 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.900943 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.901659 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.901935 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-5hw8h" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.902083 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.920683 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.923082 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.938259 4691 scope.go:117] "RemoveContainer" containerID="f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b" Sep 30 07:06:42 crc kubenswrapper[4691]: I0930 07:06:42.978336 4691 scope.go:117] "RemoveContainer" containerID="e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.003497 4691 scope.go:117] "RemoveContainer" containerID="ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204" Sep 30 07:06:43 crc kubenswrapper[4691]: E0930 07:06:43.003979 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204\": container with ID starting with ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204 not found: ID does not exist" containerID="ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.004015 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204"} err="failed to get container status \"ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204\": rpc error: code = NotFound desc = could not find container \"ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204\": container with ID starting with ce79f06433243c760f5081894f47324ef44d9e935d4dfdcb32ee476850baf204 not found: ID does not exist" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.004040 4691 scope.go:117] "RemoveContainer" containerID="8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5" Sep 30 07:06:43 crc kubenswrapper[4691]: E0930 07:06:43.004382 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5\": container with ID starting with 
8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5 not found: ID does not exist" containerID="8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.004421 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5"} err="failed to get container status \"8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5\": rpc error: code = NotFound desc = could not find container \"8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5\": container with ID starting with 8e24f260ef6ef82ea29d52730656e5557e2769f5ebd1c1afff062234d4267ed5 not found: ID does not exist" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.004450 4691 scope.go:117] "RemoveContainer" containerID="f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b" Sep 30 07:06:43 crc kubenswrapper[4691]: E0930 07:06:43.004934 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b\": container with ID starting with f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b not found: ID does not exist" containerID="f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.004971 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b"} err="failed to get container status \"f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b\": rpc error: code = NotFound desc = could not find container \"f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b\": container with ID starting with f2066cd08b87186a70377d50558d23be269f769184c84558f83ebec7565f936b not found: ID does not exist" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.004991 4691 scope.go:117] "RemoveContainer" containerID="e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6" Sep 30 07:06:43 crc kubenswrapper[4691]: E0930 07:06:43.005333 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6\": container with ID starting with e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6 not found: ID does not exist" containerID="e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.005392 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6"} err="failed to get container status \"e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6\": rpc error: code = NotFound desc = could not find container \"e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6\": container with ID starting with e70ed48128da6ed15d63b20c1b58852c8191c4f116e99cd1584116f3c89961b6 not found: ID does not exist" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.027762 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-thanos-prometheus-http-client-file\") 
pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.027820 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5621e369-5e8a-491d-aa26-098025c50c2f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.028012 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.028098 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.028339 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-config\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.028457 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5621e369-5e8a-491d-aa26-098025c50c2f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.028518 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jsst\" (UniqueName: \"kubernetes.io/projected/5621e369-5e8a-491d-aa26-098025c50c2f-kube-api-access-8jsst\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.028570 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.028655 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod 
\"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.028748 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.028805 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5621e369-5e8a-491d-aa26-098025c50c2f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130096 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130153 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130190 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5621e369-5e8a-491d-aa26-098025c50c2f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130231 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130246 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5621e369-5e8a-491d-aa26-098025c50c2f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130273 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130291 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130353 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-config\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130390 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5621e369-5e8a-491d-aa26-098025c50c2f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130417 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jsst\" (UniqueName: \"kubernetes.io/projected/5621e369-5e8a-491d-aa26-098025c50c2f-kube-api-access-8jsst\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.130438 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.132715 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/5621e369-5e8a-491d-aa26-098025c50c2f-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.135675 4691 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.135837 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c288a5d41a5881b6adb6be722d4e7a99207424eb3b5d2db5e4a72cf60753eefa/globalmount\"" pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.136715 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.137604 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/5621e369-5e8a-491d-aa26-098025c50c2f-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.141848 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.141934 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.142185 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.142334 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.142596 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/5621e369-5e8a-491d-aa26-098025c50c2f-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.143116 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/5621e369-5e8a-491d-aa26-098025c50c2f-config\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.150973 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jsst\" (UniqueName: \"kubernetes.io/projected/5621e369-5e8a-491d-aa26-098025c50c2f-kube-api-access-8jsst\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.183610 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-299d6a78-f368-48b6-8212-bc99e11a0dd5\") pod \"prometheus-metric-storage-0\" (UID: \"5621e369-5e8a-491d-aa26-098025c50c2f\") " pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.234466 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.245930 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6d36519-9195-4e0b-9760-844d420e2661" path="/var/lib/kubelet/pods/e6d36519-9195-4e0b-9760-844d420e2661/volumes" Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.735375 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Sep 30 07:06:43 crc kubenswrapper[4691]: I0930 07:06:43.783230 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5621e369-5e8a-491d-aa26-098025c50c2f","Type":"ContainerStarted","Data":"42716db5a8f6f310766a11c5c0532dda7595abb4eecbeabd67d1cbafd67d2e44"} Sep 30 07:06:48 crc kubenswrapper[4691]: I0930 07:06:48.844720 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5621e369-5e8a-491d-aa26-098025c50c2f","Type":"ContainerStarted","Data":"13af9ca24ef9072f679a68439c99f8e71f218c2e876324c4bccf8b201b1f1cad"} Sep 30 07:06:52 crc kubenswrapper[4691]: I0930 07:06:52.850216 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:06:52 crc kubenswrapper[4691]: I0930 07:06:52.850868 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:06:58 crc kubenswrapper[4691]: I0930 07:06:58.971785 4691 generic.go:334] "Generic (PLEG): container finished" podID="5621e369-5e8a-491d-aa26-098025c50c2f" containerID="13af9ca24ef9072f679a68439c99f8e71f218c2e876324c4bccf8b201b1f1cad" exitCode=0 Sep 30 07:06:58 crc kubenswrapper[4691]: I0930 07:06:58.971905 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" 
event={"ID":"5621e369-5e8a-491d-aa26-098025c50c2f","Type":"ContainerDied","Data":"13af9ca24ef9072f679a68439c99f8e71f218c2e876324c4bccf8b201b1f1cad"} Sep 30 07:06:59 crc kubenswrapper[4691]: I0930 07:06:59.985439 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5621e369-5e8a-491d-aa26-098025c50c2f","Type":"ContainerStarted","Data":"592a927f600ed4061c633760a43547b11536a5bed13c3bc0a249499c4f46a4b9"} Sep 30 07:07:04 crc kubenswrapper[4691]: I0930 07:07:04.049650 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5621e369-5e8a-491d-aa26-098025c50c2f","Type":"ContainerStarted","Data":"b2a2356d8f120a636ac12091848c665f97c0acabdda9210a476daeaf26d6175d"} Sep 30 07:07:04 crc kubenswrapper[4691]: I0930 07:07:04.050261 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"5621e369-5e8a-491d-aa26-098025c50c2f","Type":"ContainerStarted","Data":"237429d3d4c5b29bd911b85dfe89d1c4e55f3f420c12d15d1a151d9024d8dd0d"} Sep 30 07:07:04 crc kubenswrapper[4691]: I0930 07:07:04.097562 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=22.097533922 podStartE2EDuration="22.097533922s" podCreationTimestamp="2025-09-30 07:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:07:04.083761272 +0000 UTC m=+2867.558782322" watchObservedRunningTime="2025-09-30 07:07:04.097533922 +0000 UTC m=+2867.572555002" Sep 30 07:07:08 crc kubenswrapper[4691]: I0930 07:07:08.235067 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Sep 30 07:07:13 crc kubenswrapper[4691]: I0930 07:07:13.246522 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Sep 30 07:07:13 crc kubenswrapper[4691]: I0930 07:07:13.247433 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Sep 30 07:07:13 crc kubenswrapper[4691]: I0930 07:07:13.253363 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Sep 30 07:07:22 crc kubenswrapper[4691]: I0930 07:07:22.850978 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:07:22 crc kubenswrapper[4691]: I0930 07:07:22.851862 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.750794 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.754399 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.762136 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.782828 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.782923 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.783765 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-st99m" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.783990 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.817490 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.817556 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.817667 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.817747 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.817795 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.817851 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.817933 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vxc8\" (UniqueName: 
\"kubernetes.io/projected/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-kube-api-access-2vxc8\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.818061 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-config-data\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.818176 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.920539 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vxc8\" (UniqueName: \"kubernetes.io/projected/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-kube-api-access-2vxc8\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.920684 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-config-data\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.920775 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.920821 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.920848 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.920941 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.920999 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ca-certs\") pod 
\"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.921227 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.921261 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.921622 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.922536 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.922676 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.923667 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.924009 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-config-data\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.930004 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.936481 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 
07:07:37.942457 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.942843 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vxc8\" (UniqueName: \"kubernetes.io/projected/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-kube-api-access-2vxc8\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:37 crc kubenswrapper[4691]: I0930 07:07:37.962034 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"tempest-tests-tempest\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") " pod="openstack/tempest-tests-tempest" Sep 30 07:07:38 crc kubenswrapper[4691]: I0930 07:07:38.120097 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Sep 30 07:07:38 crc kubenswrapper[4691]: I0930 07:07:38.679270 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Sep 30 07:07:38 crc kubenswrapper[4691]: W0930 07:07:38.682952 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7f0691f_aa04_4bb3_b9aa_8e29fd3eeb03.slice/crio-54e2ee440c2cee54f29fc553d6e14510d13c1955ca9fffbd05c990aeda29307d WatchSource:0}: Error finding container 54e2ee440c2cee54f29fc553d6e14510d13c1955ca9fffbd05c990aeda29307d: Status 404 returned error can't find the container with id 54e2ee440c2cee54f29fc553d6e14510d13c1955ca9fffbd05c990aeda29307d Sep 30 07:07:39 crc kubenswrapper[4691]: I0930 07:07:39.496942 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03","Type":"ContainerStarted","Data":"54e2ee440c2cee54f29fc553d6e14510d13c1955ca9fffbd05c990aeda29307d"} Sep 30 07:07:50 crc kubenswrapper[4691]: I0930 07:07:50.622627 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03","Type":"ContainerStarted","Data":"cb3fe673e142ffeb08a833510f05b62401a774565973e50edb442d4e439e69f9"} Sep 30 07:07:50 crc kubenswrapper[4691]: I0930 07:07:50.668000 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.048216349 podStartE2EDuration="14.667970308s" podCreationTimestamp="2025-09-30 07:07:36 +0000 UTC" firstStartedPulling="2025-09-30 07:07:38.686203333 +0000 UTC m=+2902.161224403" lastFinishedPulling="2025-09-30 07:07:49.305957282 +0000 UTC m=+2912.780978362" observedRunningTime="2025-09-30 07:07:50.649146146 +0000 UTC m=+2914.124167266" watchObservedRunningTime="2025-09-30 07:07:50.667970308 +0000 UTC m=+2914.142991398" Sep 30 07:07:52 crc kubenswrapper[4691]: I0930 07:07:52.850266 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:07:52 crc 
kubenswrapper[4691]: I0930 07:07:52.850604 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:07:52 crc kubenswrapper[4691]: I0930 07:07:52.850663 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 07:07:52 crc kubenswrapper[4691]: I0930 07:07:52.851617 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"be8ef5b3095c5d5b2e4262f242c72c9857c3992dd41b5ac407f3540740dd3d31"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:07:52 crc kubenswrapper[4691]: I0930 07:07:52.851698 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://be8ef5b3095c5d5b2e4262f242c72c9857c3992dd41b5ac407f3540740dd3d31" gracePeriod=600 Sep 30 07:07:53 crc kubenswrapper[4691]: I0930 07:07:53.678838 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="be8ef5b3095c5d5b2e4262f242c72c9857c3992dd41b5ac407f3540740dd3d31" exitCode=0 Sep 30 07:07:53 crc kubenswrapper[4691]: I0930 07:07:53.678927 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"be8ef5b3095c5d5b2e4262f242c72c9857c3992dd41b5ac407f3540740dd3d31"} Sep 30 07:07:53 crc kubenswrapper[4691]: I0930 07:07:53.679420 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f"} Sep 30 07:07:53 crc kubenswrapper[4691]: I0930 07:07:53.679470 4691 scope.go:117] "RemoveContainer" containerID="02eef150c90beb3661cb2f8cd317bd9191cd9f934ce1f2317599df254f95be1f" Sep 30 07:10:22 crc kubenswrapper[4691]: I0930 07:10:22.850602 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:10:22 crc kubenswrapper[4691]: I0930 07:10:22.851254 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:10:52 crc kubenswrapper[4691]: I0930 07:10:52.849898 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial 
tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:10:52 crc kubenswrapper[4691]: I0930 07:10:52.850683 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:11:22 crc kubenswrapper[4691]: I0930 07:11:22.849652 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:11:22 crc kubenswrapper[4691]: I0930 07:11:22.851350 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:11:22 crc kubenswrapper[4691]: I0930 07:11:22.851506 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 07:11:22 crc kubenswrapper[4691]: I0930 07:11:22.852384 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:11:22 crc kubenswrapper[4691]: I0930 07:11:22.852580 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" gracePeriod=600 Sep 30 07:11:22 crc kubenswrapper[4691]: E0930 07:11:22.996126 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:11:23 crc kubenswrapper[4691]: I0930 07:11:23.082694 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" exitCode=0 Sep 30 07:11:23 crc kubenswrapper[4691]: I0930 07:11:23.082753 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f"} Sep 30 07:11:23 crc kubenswrapper[4691]: I0930 07:11:23.083084 4691 scope.go:117] "RemoveContainer" containerID="be8ef5b3095c5d5b2e4262f242c72c9857c3992dd41b5ac407f3540740dd3d31" Sep 30 07:11:23 crc kubenswrapper[4691]: I0930 
07:11:23.083820 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:11:23 crc kubenswrapper[4691]: E0930 07:11:23.084151 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:11:37 crc kubenswrapper[4691]: I0930 07:11:37.231258 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:11:37 crc kubenswrapper[4691]: E0930 07:11:37.232308 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:11:49 crc kubenswrapper[4691]: I0930 07:11:49.225016 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:11:49 crc kubenswrapper[4691]: E0930 07:11:49.225719 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:12:03 crc kubenswrapper[4691]: I0930 07:12:03.225041 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:12:03 crc kubenswrapper[4691]: E0930 07:12:03.225887 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:12:14 crc kubenswrapper[4691]: I0930 07:12:14.225559 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:12:14 crc kubenswrapper[4691]: E0930 07:12:14.226650 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:12:28 crc kubenswrapper[4691]: I0930 07:12:28.225202 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:12:28 crc kubenswrapper[4691]: E0930 07:12:28.225991 
4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:12:39 crc kubenswrapper[4691]: I0930 07:12:39.225768 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:12:39 crc kubenswrapper[4691]: E0930 07:12:39.226487 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:12:52 crc kubenswrapper[4691]: I0930 07:12:52.225295 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:12:52 crc kubenswrapper[4691]: E0930 07:12:52.226354 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:13:04 crc kubenswrapper[4691]: I0930 07:13:04.225321 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:13:04 crc kubenswrapper[4691]: E0930 07:13:04.226459 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:13:17 crc kubenswrapper[4691]: I0930 07:13:17.232000 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:13:17 crc kubenswrapper[4691]: E0930 07:13:17.232811 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:13:28 crc kubenswrapper[4691]: I0930 07:13:28.224737 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:13:28 crc kubenswrapper[4691]: E0930 07:13:28.225754 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:13:39 crc kubenswrapper[4691]: I0930 07:13:39.225138 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:13:39 crc kubenswrapper[4691]: E0930 07:13:39.226049 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:13:54 crc kubenswrapper[4691]: I0930 07:13:54.225364 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:13:54 crc kubenswrapper[4691]: E0930 07:13:54.226211 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:14:09 crc kubenswrapper[4691]: I0930 07:14:09.225046 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:14:09 crc kubenswrapper[4691]: E0930 07:14:09.226180 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:14:20 crc kubenswrapper[4691]: I0930 07:14:20.224854 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:14:20 crc kubenswrapper[4691]: E0930 07:14:20.225770 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:14:34 crc kubenswrapper[4691]: I0930 07:14:34.225234 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:14:34 crc kubenswrapper[4691]: E0930 07:14:34.225979 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:14:45 crc kubenswrapper[4691]: I0930 07:14:45.225294 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:14:45 crc kubenswrapper[4691]: E0930 07:14:45.226141 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.329109 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ssqbj"] Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.332616 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.348932 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ssqbj"] Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.455528 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-utilities\") pod \"community-operators-ssqbj\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") " pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.455589 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-catalog-content\") pod \"community-operators-ssqbj\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") " pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.455650 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqtdl\" (UniqueName: \"kubernetes.io/projected/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-kube-api-access-zqtdl\") pod \"community-operators-ssqbj\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") " pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.558020 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-utilities\") pod \"community-operators-ssqbj\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") " pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.558086 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-catalog-content\") pod \"community-operators-ssqbj\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") " pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.558145 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqtdl\" (UniqueName: 
\"kubernetes.io/projected/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-kube-api-access-zqtdl\") pod \"community-operators-ssqbj\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") " pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.558675 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-utilities\") pod \"community-operators-ssqbj\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") " pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.559068 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-catalog-content\") pod \"community-operators-ssqbj\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") " pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.578982 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqtdl\" (UniqueName: \"kubernetes.io/projected/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-kube-api-access-zqtdl\") pod \"community-operators-ssqbj\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") " pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:51 crc kubenswrapper[4691]: I0930 07:14:51.657664 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssqbj" Sep 30 07:14:53 crc kubenswrapper[4691]: I0930 07:14:53.165038 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ssqbj"] Sep 30 07:14:53 crc kubenswrapper[4691]: I0930 07:14:53.286203 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssqbj" event={"ID":"751286e3-50f2-4ee5-b3c9-c74a75d19bb5","Type":"ContainerStarted","Data":"92ec9f7971b8a55a3142fd0b30a97e1fad5f93a61c9833d806cb3f4803fd6105"} Sep 30 07:14:54 crc kubenswrapper[4691]: I0930 07:14:54.298694 4691 generic.go:334] "Generic (PLEG): container finished" podID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerID="a0210ec656f523942d6ae3458b5ae941d2c041b10e4e0ca2dc6fd4390c78a45b" exitCode=0 Sep 30 07:14:54 crc kubenswrapper[4691]: I0930 07:14:54.298790 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssqbj" event={"ID":"751286e3-50f2-4ee5-b3c9-c74a75d19bb5","Type":"ContainerDied","Data":"a0210ec656f523942d6ae3458b5ae941d2c041b10e4e0ca2dc6fd4390c78a45b"} Sep 30 07:14:54 crc kubenswrapper[4691]: I0930 07:14:54.301254 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:14:55 crc kubenswrapper[4691]: I0930 07:14:55.311851 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssqbj" event={"ID":"751286e3-50f2-4ee5-b3c9-c74a75d19bb5","Type":"ContainerStarted","Data":"b0101c20c8214a4668cc50afcf24649d57da534dc5668b4ad65c9d56cb93c89f"} Sep 30 07:14:57 crc kubenswrapper[4691]: I0930 07:14:57.334679 4691 generic.go:334] "Generic (PLEG): container finished" podID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerID="b0101c20c8214a4668cc50afcf24649d57da534dc5668b4ad65c9d56cb93c89f" exitCode=0 Sep 30 07:14:57 crc kubenswrapper[4691]: I0930 07:14:57.334767 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-ssqbj" event={"ID":"751286e3-50f2-4ee5-b3c9-c74a75d19bb5","Type":"ContainerDied","Data":"b0101c20c8214a4668cc50afcf24649d57da534dc5668b4ad65c9d56cb93c89f"} Sep 30 07:14:58 crc kubenswrapper[4691]: I0930 07:14:58.356068 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssqbj" event={"ID":"751286e3-50f2-4ee5-b3c9-c74a75d19bb5","Type":"ContainerStarted","Data":"5b65fc2efac43c4a33c34c54e48508e6ee023fc6ff480c14d892341ebd83161a"} Sep 30 07:14:58 crc kubenswrapper[4691]: I0930 07:14:58.381517 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ssqbj" podStartSLOduration=3.723567943 podStartE2EDuration="7.381492179s" podCreationTimestamp="2025-09-30 07:14:51 +0000 UTC" firstStartedPulling="2025-09-30 07:14:54.300793671 +0000 UTC m=+3337.775814721" lastFinishedPulling="2025-09-30 07:14:57.958717907 +0000 UTC m=+3341.433738957" observedRunningTime="2025-09-30 07:14:58.37466407 +0000 UTC m=+3341.849685120" watchObservedRunningTime="2025-09-30 07:14:58.381492179 +0000 UTC m=+3341.856513219" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.187690 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp"] Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.189325 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.195473 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.195989 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.201579 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp"] Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.226250 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:15:00 crc kubenswrapper[4691]: E0930 07:15:00.226656 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.342674 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlqmh\" (UniqueName: \"kubernetes.io/projected/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-kube-api-access-mlqmh\") pod \"collect-profiles-29320275-bpbdp\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.343370 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-config-volume\") pod \"collect-profiles-29320275-bpbdp\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.343937 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-secret-volume\") pod \"collect-profiles-29320275-bpbdp\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.446594 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlqmh\" (UniqueName: \"kubernetes.io/projected/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-kube-api-access-mlqmh\") pod \"collect-profiles-29320275-bpbdp\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.446701 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-config-volume\") pod \"collect-profiles-29320275-bpbdp\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.446779 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-secret-volume\") pod \"collect-profiles-29320275-bpbdp\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.447806 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-config-volume\") pod \"collect-profiles-29320275-bpbdp\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.454623 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-secret-volume\") pod \"collect-profiles-29320275-bpbdp\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.468072 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlqmh\" (UniqueName: \"kubernetes.io/projected/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-kube-api-access-mlqmh\") pod \"collect-profiles-29320275-bpbdp\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" Sep 30 07:15:00 crc kubenswrapper[4691]: I0930 07:15:00.518865 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp"
Sep 30 07:15:01 crc kubenswrapper[4691]: I0930 07:15:01.036860 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp"]
Sep 30 07:15:01 crc kubenswrapper[4691]: I0930 07:15:01.385823 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" event={"ID":"c4fb18c4-f0fd-438b-a522-1a7807fb7b30","Type":"ContainerStarted","Data":"656a3487a2e2480d56f0b95bb8f030caa95accbee7d82b2d6e4df1185f2d34d2"}
Sep 30 07:15:01 crc kubenswrapper[4691]: I0930 07:15:01.386022 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" event={"ID":"c4fb18c4-f0fd-438b-a522-1a7807fb7b30","Type":"ContainerStarted","Data":"b6e805526ee5108e455719c2512af6c8a510ce72344bee1dd9a5a0e836641309"}
Sep 30 07:15:01 crc kubenswrapper[4691]: I0930 07:15:01.411129 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" podStartSLOduration=1.411107989 podStartE2EDuration="1.411107989s" podCreationTimestamp="2025-09-30 07:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:15:01.403174896 +0000 UTC m=+3344.878195936" watchObservedRunningTime="2025-09-30 07:15:01.411107989 +0000 UTC m=+3344.886129029"
Sep 30 07:15:01 crc kubenswrapper[4691]: I0930 07:15:01.659584 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ssqbj"
Sep 30 07:15:01 crc kubenswrapper[4691]: I0930 07:15:01.659866 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ssqbj"
Sep 30 07:15:01 crc kubenswrapper[4691]: I0930 07:15:01.719315 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ssqbj"
Sep 30 07:15:02 crc kubenswrapper[4691]: I0930 07:15:02.398095 4691 generic.go:334] "Generic (PLEG): container finished" podID="c4fb18c4-f0fd-438b-a522-1a7807fb7b30" containerID="656a3487a2e2480d56f0b95bb8f030caa95accbee7d82b2d6e4df1185f2d34d2" exitCode=0
Sep 30 07:15:02 crc kubenswrapper[4691]: I0930 07:15:02.399491 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" event={"ID":"c4fb18c4-f0fd-438b-a522-1a7807fb7b30","Type":"ContainerDied","Data":"656a3487a2e2480d56f0b95bb8f030caa95accbee7d82b2d6e4df1185f2d34d2"}
Sep 30 07:15:03 crc kubenswrapper[4691]: I0930 07:15:03.771861 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp"
Sep 30 07:15:03 crc kubenswrapper[4691]: I0930 07:15:03.919027 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-config-volume\") pod \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") "
Sep 30 07:15:03 crc kubenswrapper[4691]: I0930 07:15:03.919177 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlqmh\" (UniqueName: \"kubernetes.io/projected/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-kube-api-access-mlqmh\") pod \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") "
Sep 30 07:15:03 crc kubenswrapper[4691]: I0930 07:15:03.919224 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-secret-volume\") pod \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\" (UID: \"c4fb18c4-f0fd-438b-a522-1a7807fb7b30\") "
Sep 30 07:15:03 crc kubenswrapper[4691]: I0930 07:15:03.919952 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-config-volume" (OuterVolumeSpecName: "config-volume") pod "c4fb18c4-f0fd-438b-a522-1a7807fb7b30" (UID: "c4fb18c4-f0fd-438b-a522-1a7807fb7b30"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:15:03 crc kubenswrapper[4691]: I0930 07:15:03.920385 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-config-volume\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:03 crc kubenswrapper[4691]: I0930 07:15:03.928518 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c4fb18c4-f0fd-438b-a522-1a7807fb7b30" (UID: "c4fb18c4-f0fd-438b-a522-1a7807fb7b30"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:15:03 crc kubenswrapper[4691]: I0930 07:15:03.931431 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-kube-api-access-mlqmh" (OuterVolumeSpecName: "kube-api-access-mlqmh") pod "c4fb18c4-f0fd-438b-a522-1a7807fb7b30" (UID: "c4fb18c4-f0fd-438b-a522-1a7807fb7b30"). InnerVolumeSpecName "kube-api-access-mlqmh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.023104 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlqmh\" (UniqueName: \"kubernetes.io/projected/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-kube-api-access-mlqmh\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.023175 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c4fb18c4-f0fd-438b-a522-1a7807fb7b30-secret-volume\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.415634 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp" event={"ID":"c4fb18c4-f0fd-438b-a522-1a7807fb7b30","Type":"ContainerDied","Data":"b6e805526ee5108e455719c2512af6c8a510ce72344bee1dd9a5a0e836641309"}
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.415928 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6e805526ee5108e455719c2512af6c8a510ce72344bee1dd9a5a0e836641309"
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.415712 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp"
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.486343 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc"]
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.494452 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320230-dkbgc"]
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.881708 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vcrwg"]
Sep 30 07:15:04 crc kubenswrapper[4691]: E0930 07:15:04.882240 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4fb18c4-f0fd-438b-a522-1a7807fb7b30" containerName="collect-profiles"
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.882254 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4fb18c4-f0fd-438b-a522-1a7807fb7b30" containerName="collect-profiles"
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.882514 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4fb18c4-f0fd-438b-a522-1a7807fb7b30" containerName="collect-profiles"
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.884225 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:04 crc kubenswrapper[4691]: I0930 07:15:04.896981 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcrwg"]
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.042277 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54z5w\" (UniqueName: \"kubernetes.io/projected/337d422d-dd4c-4cd9-869d-18a9e6f935d9-kube-api-access-54z5w\") pod \"redhat-operators-vcrwg\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") " pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.042374 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-catalog-content\") pod \"redhat-operators-vcrwg\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") " pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.043049 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-utilities\") pod \"redhat-operators-vcrwg\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") " pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.145427 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-54z5w\" (UniqueName: \"kubernetes.io/projected/337d422d-dd4c-4cd9-869d-18a9e6f935d9-kube-api-access-54z5w\") pod \"redhat-operators-vcrwg\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") " pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.145513 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-catalog-content\") pod \"redhat-operators-vcrwg\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") " pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.145626 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-utilities\") pod \"redhat-operators-vcrwg\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") " pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.146034 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-catalog-content\") pod \"redhat-operators-vcrwg\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") " pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.146097 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-utilities\") pod \"redhat-operators-vcrwg\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") " pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.172862 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-54z5w\" (UniqueName: \"kubernetes.io/projected/337d422d-dd4c-4cd9-869d-18a9e6f935d9-kube-api-access-54z5w\") pod \"redhat-operators-vcrwg\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") " pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.257353 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e35ed7eb-1300-40cb-b087-8d4aa2cb1daa" path="/var/lib/kubelet/pods/e35ed7eb-1300-40cb-b087-8d4aa2cb1daa/volumes"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.257832 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:05 crc kubenswrapper[4691]: I0930 07:15:05.741009 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vcrwg"]
Sep 30 07:15:06 crc kubenswrapper[4691]: I0930 07:15:06.435234 4691 generic.go:334] "Generic (PLEG): container finished" podID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerID="fcaa8cad303c8d88725cb9540ea88ba8e8d2c94ea7e868becfbd93288f882383" exitCode=0
Sep 30 07:15:06 crc kubenswrapper[4691]: I0930 07:15:06.435298 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrwg" event={"ID":"337d422d-dd4c-4cd9-869d-18a9e6f935d9","Type":"ContainerDied","Data":"fcaa8cad303c8d88725cb9540ea88ba8e8d2c94ea7e868becfbd93288f882383"}
Sep 30 07:15:06 crc kubenswrapper[4691]: I0930 07:15:06.435535 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrwg" event={"ID":"337d422d-dd4c-4cd9-869d-18a9e6f935d9","Type":"ContainerStarted","Data":"b16659ba0b05be9d5edb7d12a9e17f0717adc5175d7a24e9367184e5f67529fe"}
Sep 30 07:15:08 crc kubenswrapper[4691]: I0930 07:15:08.465566 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrwg" event={"ID":"337d422d-dd4c-4cd9-869d-18a9e6f935d9","Type":"ContainerStarted","Data":"b95248722918c38c136f2e252f707bd446e9d1c846e37eb0fd9d5432621d3910"}
Sep 30 07:15:11 crc kubenswrapper[4691]: I0930 07:15:11.711243 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ssqbj"
Sep 30 07:15:11 crc kubenswrapper[4691]: I0930 07:15:11.772312 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ssqbj"]
Sep 30 07:15:12 crc kubenswrapper[4691]: I0930 07:15:12.498287 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ssqbj" podUID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerName="registry-server" containerID="cri-o://5b65fc2efac43c4a33c34c54e48508e6ee023fc6ff480c14d892341ebd83161a" gracePeriod=2
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.522084 4691 generic.go:334] "Generic (PLEG): container finished" podID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerID="5b65fc2efac43c4a33c34c54e48508e6ee023fc6ff480c14d892341ebd83161a" exitCode=0
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.522198 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssqbj" event={"ID":"751286e3-50f2-4ee5-b3c9-c74a75d19bb5","Type":"ContainerDied","Data":"5b65fc2efac43c4a33c34c54e48508e6ee023fc6ff480c14d892341ebd83161a"}
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.699389 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssqbj"
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.728571 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-catalog-content\") pod \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") "
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.728715 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-utilities\") pod \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") "
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.728975 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqtdl\" (UniqueName: \"kubernetes.io/projected/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-kube-api-access-zqtdl\") pod \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\" (UID: \"751286e3-50f2-4ee5-b3c9-c74a75d19bb5\") "
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.729764 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-utilities" (OuterVolumeSpecName: "utilities") pod "751286e3-50f2-4ee5-b3c9-c74a75d19bb5" (UID: "751286e3-50f2-4ee5-b3c9-c74a75d19bb5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.772893 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "751286e3-50f2-4ee5-b3c9-c74a75d19bb5" (UID: "751286e3-50f2-4ee5-b3c9-c74a75d19bb5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.777063 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-kube-api-access-zqtdl" (OuterVolumeSpecName: "kube-api-access-zqtdl") pod "751286e3-50f2-4ee5-b3c9-c74a75d19bb5" (UID: "751286e3-50f2-4ee5-b3c9-c74a75d19bb5"). InnerVolumeSpecName "kube-api-access-zqtdl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.831868 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqtdl\" (UniqueName: \"kubernetes.io/projected/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-kube-api-access-zqtdl\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.831911 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:13 crc kubenswrapper[4691]: I0930 07:15:13.831922 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/751286e3-50f2-4ee5-b3c9-c74a75d19bb5-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:14 crc kubenswrapper[4691]: I0930 07:15:14.225898 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f"
Sep 30 07:15:14 crc kubenswrapper[4691]: E0930 07:15:14.226285 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:15:14 crc kubenswrapper[4691]: I0930 07:15:14.533859 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ssqbj" event={"ID":"751286e3-50f2-4ee5-b3c9-c74a75d19bb5","Type":"ContainerDied","Data":"92ec9f7971b8a55a3142fd0b30a97e1fad5f93a61c9833d806cb3f4803fd6105"}
Sep 30 07:15:14 crc kubenswrapper[4691]: I0930 07:15:14.533946 4691 scope.go:117] "RemoveContainer" containerID="5b65fc2efac43c4a33c34c54e48508e6ee023fc6ff480c14d892341ebd83161a"
Sep 30 07:15:14 crc kubenswrapper[4691]: I0930 07:15:14.534098 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ssqbj"
Sep 30 07:15:14 crc kubenswrapper[4691]: I0930 07:15:14.571646 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ssqbj"]
Sep 30 07:15:14 crc kubenswrapper[4691]: I0930 07:15:14.572579 4691 scope.go:117] "RemoveContainer" containerID="b0101c20c8214a4668cc50afcf24649d57da534dc5668b4ad65c9d56cb93c89f"
Sep 30 07:15:14 crc kubenswrapper[4691]: I0930 07:15:14.583300 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ssqbj"]
Sep 30 07:15:14 crc kubenswrapper[4691]: I0930 07:15:14.594015 4691 scope.go:117] "RemoveContainer" containerID="a0210ec656f523942d6ae3458b5ae941d2c041b10e4e0ca2dc6fd4390c78a45b"
Sep 30 07:15:15 crc kubenswrapper[4691]: I0930 07:15:15.248831 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" path="/var/lib/kubelet/pods/751286e3-50f2-4ee5-b3c9-c74a75d19bb5/volumes"
Sep 30 07:15:18 crc kubenswrapper[4691]: I0930 07:15:18.603340 4691 generic.go:334] "Generic (PLEG): container finished" podID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerID="b95248722918c38c136f2e252f707bd446e9d1c846e37eb0fd9d5432621d3910" exitCode=0
Sep 30 07:15:18 crc kubenswrapper[4691]: I0930 07:15:18.603443 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrwg" event={"ID":"337d422d-dd4c-4cd9-869d-18a9e6f935d9","Type":"ContainerDied","Data":"b95248722918c38c136f2e252f707bd446e9d1c846e37eb0fd9d5432621d3910"}
Sep 30 07:15:20 crc kubenswrapper[4691]: I0930 07:15:20.627378 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrwg" event={"ID":"337d422d-dd4c-4cd9-869d-18a9e6f935d9","Type":"ContainerStarted","Data":"f675f18d4eda47a3c97e4159c1aa27232bba912fb81f16927d35d564de80edf6"}
Sep 30 07:15:25 crc kubenswrapper[4691]: I0930 07:15:25.258353 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:25 crc kubenswrapper[4691]: I0930 07:15:25.258810 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:26 crc kubenswrapper[4691]: I0930 07:15:26.322814 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vcrwg" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerName="registry-server" probeResult="failure" output=<
Sep 30 07:15:26 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s
Sep 30 07:15:26 crc kubenswrapper[4691]: >
Sep 30 07:15:26 crc kubenswrapper[4691]: I0930 07:15:26.445937 4691 scope.go:117] "RemoveContainer" containerID="ed2d986debd486b47f26a584828f12308d58cc18e95a6f9578be61260d53ae36"
Sep 30 07:15:29 crc kubenswrapper[4691]: I0930 07:15:29.224435 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f"
Sep 30 07:15:29 crc kubenswrapper[4691]: E0930 07:15:29.225257 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:15:36 crc kubenswrapper[4691]: I0930 07:15:36.332235 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vcrwg" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerName="registry-server" probeResult="failure" output=<
Sep 30 07:15:36 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s
Sep 30 07:15:36 crc kubenswrapper[4691]: >
Sep 30 07:15:41 crc kubenswrapper[4691]: I0930 07:15:41.225265 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f"
Sep 30 07:15:41 crc kubenswrapper[4691]: E0930 07:15:41.225976 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:15:45 crc kubenswrapper[4691]: I0930 07:15:45.313491 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:45 crc kubenswrapper[4691]: I0930 07:15:45.336122 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vcrwg" podStartSLOduration=28.316840751 podStartE2EDuration="41.336103853s" podCreationTimestamp="2025-09-30 07:15:04 +0000 UTC" firstStartedPulling="2025-09-30 07:15:06.437402981 +0000 UTC m=+3349.912424041" lastFinishedPulling="2025-09-30 07:15:19.456666103 +0000 UTC m=+3362.931687143" observedRunningTime="2025-09-30 07:15:20.651392315 +0000 UTC m=+3364.126413365" watchObservedRunningTime="2025-09-30 07:15:45.336103853 +0000 UTC m=+3388.811124903"
Sep 30 07:15:45 crc kubenswrapper[4691]: I0930 07:15:45.370083 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:45 crc kubenswrapper[4691]: I0930 07:15:45.564122 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vcrwg"]
Sep 30 07:15:46 crc kubenswrapper[4691]: I0930 07:15:46.896486 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vcrwg" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerName="registry-server" containerID="cri-o://f675f18d4eda47a3c97e4159c1aa27232bba912fb81f16927d35d564de80edf6" gracePeriod=2
Sep 30 07:15:47 crc kubenswrapper[4691]: I0930 07:15:47.913280 4691 generic.go:334] "Generic (PLEG): container finished" podID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerID="f675f18d4eda47a3c97e4159c1aa27232bba912fb81f16927d35d564de80edf6" exitCode=0
Sep 30 07:15:47 crc kubenswrapper[4691]: I0930 07:15:47.913777 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrwg" event={"ID":"337d422d-dd4c-4cd9-869d-18a9e6f935d9","Type":"ContainerDied","Data":"f675f18d4eda47a3c97e4159c1aa27232bba912fb81f16927d35d564de80edf6"}
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.023298 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.027209 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-utilities\") pod \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") "
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.027575 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-54z5w\" (UniqueName: \"kubernetes.io/projected/337d422d-dd4c-4cd9-869d-18a9e6f935d9-kube-api-access-54z5w\") pod \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") "
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.027656 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-catalog-content\") pod \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") "
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.028313 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-utilities" (OuterVolumeSpecName: "utilities") pod "337d422d-dd4c-4cd9-869d-18a9e6f935d9" (UID: "337d422d-dd4c-4cd9-869d-18a9e6f935d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.037588 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/337d422d-dd4c-4cd9-869d-18a9e6f935d9-kube-api-access-54z5w" (OuterVolumeSpecName: "kube-api-access-54z5w") pod "337d422d-dd4c-4cd9-869d-18a9e6f935d9" (UID: "337d422d-dd4c-4cd9-869d-18a9e6f935d9"). InnerVolumeSpecName "kube-api-access-54z5w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.129581 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "337d422d-dd4c-4cd9-869d-18a9e6f935d9" (UID: "337d422d-dd4c-4cd9-869d-18a9e6f935d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.130188 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-catalog-content\") pod \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\" (UID: \"337d422d-dd4c-4cd9-869d-18a9e6f935d9\") "
Sep 30 07:15:48 crc kubenswrapper[4691]: W0930 07:15:48.130351 4691 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/337d422d-dd4c-4cd9-869d-18a9e6f935d9/volumes/kubernetes.io~empty-dir/catalog-content
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.130370 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "337d422d-dd4c-4cd9-869d-18a9e6f935d9" (UID: "337d422d-dd4c-4cd9-869d-18a9e6f935d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.131098 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-54z5w\" (UniqueName: \"kubernetes.io/projected/337d422d-dd4c-4cd9-869d-18a9e6f935d9-kube-api-access-54z5w\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.131204 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.131298 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/337d422d-dd4c-4cd9-869d-18a9e6f935d9-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.924861 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vcrwg" event={"ID":"337d422d-dd4c-4cd9-869d-18a9e6f935d9","Type":"ContainerDied","Data":"b16659ba0b05be9d5edb7d12a9e17f0717adc5175d7a24e9367184e5f67529fe"}
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.925231 4691 scope.go:117] "RemoveContainer" containerID="f675f18d4eda47a3c97e4159c1aa27232bba912fb81f16927d35d564de80edf6"
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.924985 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vcrwg"
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.960300 4691 scope.go:117] "RemoveContainer" containerID="b95248722918c38c136f2e252f707bd446e9d1c846e37eb0fd9d5432621d3910"
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.980018 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vcrwg"]
Sep 30 07:15:48 crc kubenswrapper[4691]: I0930 07:15:48.994267 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vcrwg"]
Sep 30 07:15:49 crc kubenswrapper[4691]: I0930 07:15:49.002752 4691 scope.go:117] "RemoveContainer" containerID="fcaa8cad303c8d88725cb9540ea88ba8e8d2c94ea7e868becfbd93288f882383"
Sep 30 07:15:49 crc kubenswrapper[4691]: I0930 07:15:49.247820 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" path="/var/lib/kubelet/pods/337d422d-dd4c-4cd9-869d-18a9e6f935d9/volumes"
Sep 30 07:15:54 crc kubenswrapper[4691]: I0930 07:15:54.224433 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f"
Sep 30 07:15:54 crc kubenswrapper[4691]: E0930 07:15:54.225244 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:16:05 crc kubenswrapper[4691]: I0930 07:16:05.225153 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f"
Sep 30 07:16:05 crc kubenswrapper[4691]: E0930 07:16:05.227202 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:16:20 crc kubenswrapper[4691]: I0930 07:16:20.226436 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f"
Sep 30 07:16:20 crc kubenswrapper[4691]: E0930 07:16:20.227241 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.533239 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9xmqg"]
Sep 30 07:16:30 crc kubenswrapper[4691]: E0930 07:16:30.535006 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerName="extract-utilities"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.535026 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerName="extract-utilities"
Sep 30 07:16:30 crc kubenswrapper[4691]: E0930 07:16:30.535043 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerName="extract-content"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.535049 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerName="extract-content"
Sep 30 07:16:30 crc kubenswrapper[4691]: E0930 07:16:30.535104 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerName="extract-utilities"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.535111 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerName="extract-utilities"
Sep 30 07:16:30 crc kubenswrapper[4691]: E0930 07:16:30.535135 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerName="registry-server"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.535143 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerName="registry-server"
Sep 30 07:16:30 crc kubenswrapper[4691]: E0930 07:16:30.535174 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerName="extract-content"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.535182 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerName="extract-content"
Sep 30 07:16:30 crc kubenswrapper[4691]: E0930 07:16:30.535200 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerName="registry-server"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.535205 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerName="registry-server"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.537284 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="337d422d-dd4c-4cd9-869d-18a9e6f935d9" containerName="registry-server"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.537316 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="751286e3-50f2-4ee5-b3c9-c74a75d19bb5" containerName="registry-server"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.540152 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.554152 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9xmqg"]
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.681923 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f-utilities\") pod \"certified-operators-9xmqg\" (UID: \"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f\") " pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.682305 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f-catalog-content\") pod \"certified-operators-9xmqg\" (UID: \"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f\") " pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.682406 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4pj6\" (UniqueName: \"kubernetes.io/projected/93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f-kube-api-access-z4pj6\") pod \"certified-operators-9xmqg\" (UID: \"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f\") " pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.784593 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f-catalog-content\") pod \"certified-operators-9xmqg\" (UID: \"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f\") " pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.784673 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4pj6\" (UniqueName: \"kubernetes.io/projected/93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f-kube-api-access-z4pj6\") pod \"certified-operators-9xmqg\" (UID: \"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f\") " pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.784760 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f-utilities\") pod \"certified-operators-9xmqg\" (UID: \"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f\") " pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.785270 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f-utilities\") pod \"certified-operators-9xmqg\" (UID: \"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f\") " pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.785507 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f-catalog-content\") pod \"certified-operators-9xmqg\" (UID: \"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f\") " pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.806706 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4pj6\" (UniqueName: \"kubernetes.io/projected/93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f-kube-api-access-z4pj6\") pod \"certified-operators-9xmqg\" (UID: \"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f\") " pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:30 crc kubenswrapper[4691]: I0930 07:16:30.876358 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:31 crc kubenswrapper[4691]: I0930 07:16:31.221974 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9xmqg"]
Sep 30 07:16:31 crc kubenswrapper[4691]: W0930 07:16:31.224147 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c048ec_3ef2_4dc9_af06_5f3aa9bcae6f.slice/crio-75b1e3d704f7916e7d176b9e21e38808289c65ad7d59eb73eb9fa88e58162577 WatchSource:0}: Error finding container 75b1e3d704f7916e7d176b9e21e38808289c65ad7d59eb73eb9fa88e58162577: Status 404 returned error can't find the container with id 75b1e3d704f7916e7d176b9e21e38808289c65ad7d59eb73eb9fa88e58162577
Sep 30 07:16:31 crc kubenswrapper[4691]: I0930 07:16:31.412336 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xmqg" event={"ID":"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f","Type":"ContainerStarted","Data":"75b1e3d704f7916e7d176b9e21e38808289c65ad7d59eb73eb9fa88e58162577"}
Sep 30 07:16:32 crc kubenswrapper[4691]: I0930 07:16:32.423580 4691 generic.go:334] "Generic (PLEG): container finished" podID="93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f" containerID="ff6691b6436c510bdfc023604f615f1dbd7ea5ce5ea264343e7a08aaeaf15873" exitCode=0
Sep 30 07:16:32 crc kubenswrapper[4691]: I0930 07:16:32.423622 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xmqg" event={"ID":"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f","Type":"ContainerDied","Data":"ff6691b6436c510bdfc023604f615f1dbd7ea5ce5ea264343e7a08aaeaf15873"}
Sep 30 07:16:33 crc kubenswrapper[4691]: I0930 07:16:33.224838 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f"
Sep 30 07:16:34 crc kubenswrapper[4691]: I0930 07:16:34.453052 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"0434386cd76a49bfe15e3153e96b9f35396c2bb1bd37e200ecf8ba3d62d6b4a5"}
Sep 30 07:16:34 crc kubenswrapper[4691]: I0930 07:16:34.911068 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2j5"]
Sep 30 07:16:34 crc kubenswrapper[4691]: I0930 07:16:34.913539 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:34 crc kubenswrapper[4691]: I0930 07:16:34.925687 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2j5"]
Sep 30 07:16:35 crc kubenswrapper[4691]: I0930 07:16:35.086298 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-utilities\") pod \"redhat-marketplace-bx2j5\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") " pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:35 crc kubenswrapper[4691]: I0930 07:16:35.086411 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjc9w\" (UniqueName: \"kubernetes.io/projected/1802caf3-f3b8-4d0b-8442-1b587308a100-kube-api-access-sjc9w\") pod \"redhat-marketplace-bx2j5\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") " pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:35 crc kubenswrapper[4691]: I0930 07:16:35.086795 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-catalog-content\") pod \"redhat-marketplace-bx2j5\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") " pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:35 crc kubenswrapper[4691]: I0930 07:16:35.188363 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-catalog-content\") pod \"redhat-marketplace-bx2j5\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") " pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:35 crc kubenswrapper[4691]: I0930 07:16:35.188447 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-utilities\") pod \"redhat-marketplace-bx2j5\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") " pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:35 crc kubenswrapper[4691]: I0930 07:16:35.188525 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjc9w\" (UniqueName: \"kubernetes.io/projected/1802caf3-f3b8-4d0b-8442-1b587308a100-kube-api-access-sjc9w\") pod \"redhat-marketplace-bx2j5\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") " pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:35 crc kubenswrapper[4691]: I0930 07:16:35.188818 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-catalog-content\") pod \"redhat-marketplace-bx2j5\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") " pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:35 crc kubenswrapper[4691]: I0930 07:16:35.189025 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-utilities\") pod \"redhat-marketplace-bx2j5\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") " pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:35 crc kubenswrapper[4691]: I0930 07:16:35.218068 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjc9w\" (UniqueName: \"kubernetes.io/projected/1802caf3-f3b8-4d0b-8442-1b587308a100-kube-api-access-sjc9w\") pod \"redhat-marketplace-bx2j5\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") " pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:35 crc kubenswrapper[4691]: I0930 07:16:35.235664 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:38 crc kubenswrapper[4691]: I0930 07:16:38.884257 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2j5"]
Sep 30 07:16:39 crc kubenswrapper[4691]: I0930 07:16:39.505602 4691 generic.go:334] "Generic (PLEG): container finished" podID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerID="4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4" exitCode=0
Sep 30 07:16:39 crc kubenswrapper[4691]: I0930 07:16:39.506181 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2j5" event={"ID":"1802caf3-f3b8-4d0b-8442-1b587308a100","Type":"ContainerDied","Data":"4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4"}
Sep 30 07:16:39 crc kubenswrapper[4691]: I0930 07:16:39.507128 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2j5" event={"ID":"1802caf3-f3b8-4d0b-8442-1b587308a100","Type":"ContainerStarted","Data":"7e02f0d7c1e9de192dd75b85ee5b6e10277989ddd27c29c37e690023827a1f09"}
Sep 30 07:16:39 crc kubenswrapper[4691]: I0930 07:16:39.511360 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xmqg" event={"ID":"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f","Type":"ContainerStarted","Data":"ea6b3657d1d483ef76a21af020ac412c92350b53bc327a79005fe89f4799c496"}
Sep 30 07:16:40 crc kubenswrapper[4691]: I0930 07:16:40.524740 4691 generic.go:334] "Generic (PLEG): container finished" podID="93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f" containerID="ea6b3657d1d483ef76a21af020ac412c92350b53bc327a79005fe89f4799c496" exitCode=0
Sep 30 07:16:40 crc kubenswrapper[4691]: I0930 07:16:40.524856 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xmqg" event={"ID":"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f","Type":"ContainerDied","Data":"ea6b3657d1d483ef76a21af020ac412c92350b53bc327a79005fe89f4799c496"}
Sep 30 07:16:41 crc kubenswrapper[4691]: I0930 07:16:41.537495 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9xmqg" event={"ID":"93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f","Type":"ContainerStarted","Data":"d2f214930c2ca4690fd65cb6ad356b5d5e2e8c442a97e63545e1f3931d7c815f"}
Sep 30 07:16:41 crc kubenswrapper[4691]: I0930 07:16:41.541656 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2j5" event={"ID":"1802caf3-f3b8-4d0b-8442-1b587308a100","Type":"ContainerStarted","Data":"c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f"}
Sep 30 07:16:41 crc kubenswrapper[4691]: I0930 07:16:41.559991 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9xmqg" podStartSLOduration=2.880675113 podStartE2EDuration="11.559976277s" podCreationTimestamp="2025-09-30 07:16:30 +0000 UTC" firstStartedPulling="2025-09-30 07:16:32.425596884 +0000 UTC m=+3435.900617924" lastFinishedPulling="2025-09-30 07:16:41.104898048 +0000 UTC m=+3444.579919088" observedRunningTime="2025-09-30 07:16:41.557999364 +0000 UTC m=+3445.033020404" watchObservedRunningTime="2025-09-30 07:16:41.559976277 +0000 UTC m=+3445.034997317"
Sep 30 07:16:42 crc kubenswrapper[4691]: I0930 07:16:42.553217 4691 generic.go:334] "Generic (PLEG): container finished" podID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerID="c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f" exitCode=0
Sep 30 07:16:42 crc kubenswrapper[4691]: I0930 07:16:42.553402 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2j5" event={"ID":"1802caf3-f3b8-4d0b-8442-1b587308a100","Type":"ContainerDied","Data":"c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f"}
Sep 30 07:16:44 crc kubenswrapper[4691]: I0930 07:16:44.594437 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2j5" event={"ID":"1802caf3-f3b8-4d0b-8442-1b587308a100","Type":"ContainerStarted","Data":"6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782"}
Sep 30 07:16:44 crc kubenswrapper[4691]: I0930 07:16:44.613825 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bx2j5" podStartSLOduration=6.694085054 podStartE2EDuration="10.61380692s" podCreationTimestamp="2025-09-30 07:16:34 +0000 UTC" firstStartedPulling="2025-09-30 07:16:39.50786613 +0000 UTC m=+3442.982887160" lastFinishedPulling="2025-09-30 07:16:43.427587986 +0000 UTC m=+3446.902609026" observedRunningTime="2025-09-30 07:16:44.61287817 +0000 UTC m=+3448.087899250" watchObservedRunningTime="2025-09-30 07:16:44.61380692 +0000 UTC m=+3448.088827960"
Sep 30 07:16:45 crc kubenswrapper[4691]: I0930 07:16:45.243020 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:45 crc kubenswrapper[4691]: I0930 07:16:45.243076 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:46 crc kubenswrapper[4691]: I0930 07:16:46.302485 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-bx2j5" podUID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerName="registry-server" probeResult="failure" output=<
Sep 30 07:16:46 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s
Sep 30 07:16:46 crc kubenswrapper[4691]: >
Sep 30 07:16:50 crc kubenswrapper[4691]: I0930 07:16:50.878199 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:50 crc kubenswrapper[4691]: I0930 07:16:50.878858 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:16:51 crc kubenswrapper[4691]: I0930 07:16:51.930572 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9xmqg" podUID="93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f" containerName="registry-server" probeResult="failure" output=<
Sep 30 07:16:51 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s
Sep 30 07:16:51 crc kubenswrapper[4691]: >
Sep 30 07:16:55 crc kubenswrapper[4691]: I0930 07:16:55.304984 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:55 crc kubenswrapper[4691]: I0930 07:16:55.357531 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:55 crc kubenswrapper[4691]: I0930 07:16:55.543436 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2j5"]
Sep 30 07:16:56 crc kubenswrapper[4691]: I0930 07:16:56.716962 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bx2j5" podUID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerName="registry-server" containerID="cri-o://6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782" gracePeriod=2
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.236818 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.307196 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-utilities\") pod \"1802caf3-f3b8-4d0b-8442-1b587308a100\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") "
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.307286 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjc9w\" (UniqueName: \"kubernetes.io/projected/1802caf3-f3b8-4d0b-8442-1b587308a100-kube-api-access-sjc9w\") pod \"1802caf3-f3b8-4d0b-8442-1b587308a100\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") "
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.307412 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-catalog-content\") pod \"1802caf3-f3b8-4d0b-8442-1b587308a100\" (UID: \"1802caf3-f3b8-4d0b-8442-1b587308a100\") "
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.308865 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-utilities" (OuterVolumeSpecName: "utilities") pod "1802caf3-f3b8-4d0b-8442-1b587308a100" (UID: "1802caf3-f3b8-4d0b-8442-1b587308a100"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.315811 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1802caf3-f3b8-4d0b-8442-1b587308a100-kube-api-access-sjc9w" (OuterVolumeSpecName: "kube-api-access-sjc9w") pod "1802caf3-f3b8-4d0b-8442-1b587308a100" (UID: "1802caf3-f3b8-4d0b-8442-1b587308a100"). InnerVolumeSpecName "kube-api-access-sjc9w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.327138 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1802caf3-f3b8-4d0b-8442-1b587308a100" (UID: "1802caf3-f3b8-4d0b-8442-1b587308a100"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.411103 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.411170 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjc9w\" (UniqueName: \"kubernetes.io/projected/1802caf3-f3b8-4d0b-8442-1b587308a100-kube-api-access-sjc9w\") on node \"crc\" DevicePath \"\""
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.411193 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1802caf3-f3b8-4d0b-8442-1b587308a100-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.731601 4691 generic.go:334] "Generic (PLEG): container finished" podID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerID="6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782" exitCode=0
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.731671 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2j5" event={"ID":"1802caf3-f3b8-4d0b-8442-1b587308a100","Type":"ContainerDied","Data":"6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782"}
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.731703 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bx2j5" event={"ID":"1802caf3-f3b8-4d0b-8442-1b587308a100","Type":"ContainerDied","Data":"7e02f0d7c1e9de192dd75b85ee5b6e10277989ddd27c29c37e690023827a1f09"}
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.731743 4691 scope.go:117] "RemoveContainer" containerID="6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782"
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.731754 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bx2j5"
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.785967 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2j5"]
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.794005 4691 scope.go:117] "RemoveContainer" containerID="c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f"
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.796503 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bx2j5"]
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.847914 4691 scope.go:117] "RemoveContainer" containerID="4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4"
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.881600 4691 scope.go:117] "RemoveContainer" containerID="6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782"
Sep 30 07:16:57 crc kubenswrapper[4691]: E0930 07:16:57.882570 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782\": container with ID starting with 6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782 not found: ID does not exist" containerID="6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782"
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.882613 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782"} err="failed to get container status \"6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782\": rpc error: code = NotFound desc = could not find container \"6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782\": container with ID starting with 6c3b144a485df29563a11f183411d96259dcdc59b639ce2de87b00a46a9f5782 not found: ID does not exist"
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.882638 4691 scope.go:117] "RemoveContainer" containerID="c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f"
Sep 30 07:16:57 crc kubenswrapper[4691]: E0930 07:16:57.883033 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f\": container with ID starting with c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f not found: ID does not exist" containerID="c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f"
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.883054 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f"} err="failed to get container status \"c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f\": rpc error: code = NotFound desc = could not find container \"c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f\": container with ID starting with c6daca10b142ef15269f4f196b846bb5fe39a6dc71d88318a35a98c9d50a1b5f not found: ID does not exist"
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.883066 4691 scope.go:117] "RemoveContainer" containerID="4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4"
Sep 30 07:16:57 crc kubenswrapper[4691]: E0930 07:16:57.883239 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4\": container with ID starting with 4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4 not found: ID does not exist" containerID="4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4"
Sep 30 07:16:57 crc kubenswrapper[4691]: I0930 07:16:57.883265 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4"} err="failed to get container status \"4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4\": rpc error: code = NotFound desc = could not find container \"4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4\": container with ID starting with 4cbaedf504c0289f3a03ee256e9845aaf57c51e6b310209a067a66f1094db3d4 not found: ID does not exist"
Sep 30 07:16:59 crc kubenswrapper[4691]: I0930 07:16:59.236056 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1802caf3-f3b8-4d0b-8442-1b587308a100" path="/var/lib/kubelet/pods/1802caf3-f3b8-4d0b-8442-1b587308a100/volumes"
Sep 30 07:17:01 crc kubenswrapper[4691]: I0930 07:17:01.931419 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-9xmqg" podUID="93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f" containerName="registry-server" probeResult="failure" output=<
Sep 30 07:17:01 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s
Sep 30 07:17:01 crc kubenswrapper[4691]: >
Sep 30 07:17:10 crc kubenswrapper[4691]: I0930 07:17:10.925970 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.023632 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9xmqg"
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.109541 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9xmqg"]
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.209303 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwbg9"]
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.209579 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hwbg9" podUID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerName="registry-server" containerID="cri-o://b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc" gracePeriod=2
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.716767 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hwbg9"
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.830436 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-utilities\") pod \"38edfb7b-d43d-4492-bc1b-8281e28991c0\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") "
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.830571 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-catalog-content\") pod \"38edfb7b-d43d-4492-bc1b-8281e28991c0\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") "
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.830815 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqd7z\" (UniqueName: \"kubernetes.io/projected/38edfb7b-d43d-4492-bc1b-8281e28991c0-kube-api-access-pqd7z\") pod \"38edfb7b-d43d-4492-bc1b-8281e28991c0\" (UID: \"38edfb7b-d43d-4492-bc1b-8281e28991c0\") "
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.834490 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-utilities" (OuterVolumeSpecName: "utilities") pod "38edfb7b-d43d-4492-bc1b-8281e28991c0" (UID: "38edfb7b-d43d-4492-bc1b-8281e28991c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.840052 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38edfb7b-d43d-4492-bc1b-8281e28991c0-kube-api-access-pqd7z" (OuterVolumeSpecName: "kube-api-access-pqd7z") pod "38edfb7b-d43d-4492-bc1b-8281e28991c0" (UID: "38edfb7b-d43d-4492-bc1b-8281e28991c0"). InnerVolumeSpecName "kube-api-access-pqd7z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.886186 4691 generic.go:334] "Generic (PLEG): container finished" podID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerID="b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc" exitCode=0
Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.887264 4691 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-hwbg9" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.887368 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwbg9" event={"ID":"38edfb7b-d43d-4492-bc1b-8281e28991c0","Type":"ContainerDied","Data":"b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc"} Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.887404 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hwbg9" event={"ID":"38edfb7b-d43d-4492-bc1b-8281e28991c0","Type":"ContainerDied","Data":"1f4ddc3eedb023b90e8889ae3da320a9004b185e834a0504c4f0a37b9bd7f498"} Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.887427 4691 scope.go:117] "RemoveContainer" containerID="b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.923169 4691 scope.go:117] "RemoveContainer" containerID="1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.933269 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.933306 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqd7z\" (UniqueName: \"kubernetes.io/projected/38edfb7b-d43d-4492-bc1b-8281e28991c0-kube-api-access-pqd7z\") on node \"crc\" DevicePath \"\"" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.955694 4691 scope.go:117] "RemoveContainer" containerID="c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.974069 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38edfb7b-d43d-4492-bc1b-8281e28991c0" (UID: "38edfb7b-d43d-4492-bc1b-8281e28991c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.996119 4691 scope.go:117] "RemoveContainer" containerID="b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc" Sep 30 07:17:11 crc kubenswrapper[4691]: E0930 07:17:11.996737 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc\": container with ID starting with b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc not found: ID does not exist" containerID="b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.996771 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc"} err="failed to get container status \"b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc\": rpc error: code = NotFound desc = could not find container \"b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc\": container with ID starting with b3b4083d4931a476e33aa7c89387b7aa63b24a88433bdcbc005007bdd364cabc not found: ID does not exist" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.996792 4691 scope.go:117] "RemoveContainer" containerID="1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b" Sep 30 07:17:11 crc kubenswrapper[4691]: E0930 07:17:11.997165 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b\": container with ID starting with 1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b not found: ID does not exist" containerID="1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.997193 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b"} err="failed to get container status \"1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b\": rpc error: code = NotFound desc = could not find container \"1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b\": container with ID starting with 1c34e7fce6c4fb93e0cc366f9138b08d04cc9ddcf5b424e69303b88f7e9a392b not found: ID does not exist" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.997206 4691 scope.go:117] "RemoveContainer" containerID="c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98" Sep 30 07:17:11 crc kubenswrapper[4691]: E0930 07:17:11.997434 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98\": container with ID starting with c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98 not found: ID does not exist" containerID="c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98" Sep 30 07:17:11 crc kubenswrapper[4691]: I0930 07:17:11.997457 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98"} err="failed to get container status \"c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98\": rpc error: code = NotFound desc = could not 
find container \"c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98\": container with ID starting with c6607ed3ded0e334eb7e6e51c9e6e839fd186b492c11e28d54f078c0d1a05d98 not found: ID does not exist" Sep 30 07:17:12 crc kubenswrapper[4691]: I0930 07:17:12.035553 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38edfb7b-d43d-4492-bc1b-8281e28991c0-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:17:12 crc kubenswrapper[4691]: I0930 07:17:12.231040 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hwbg9"] Sep 30 07:17:12 crc kubenswrapper[4691]: I0930 07:17:12.242626 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-hwbg9"] Sep 30 07:17:13 crc kubenswrapper[4691]: I0930 07:17:13.238290 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38edfb7b-d43d-4492-bc1b-8281e28991c0" path="/var/lib/kubelet/pods/38edfb7b-d43d-4492-bc1b-8281e28991c0/volumes" Sep 30 07:18:52 crc kubenswrapper[4691]: I0930 07:18:52.850572 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:18:52 crc kubenswrapper[4691]: I0930 07:18:52.852131 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:19:22 crc kubenswrapper[4691]: I0930 07:19:22.850447 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:19:22 crc kubenswrapper[4691]: I0930 07:19:22.851082 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:19:52 crc kubenswrapper[4691]: I0930 07:19:52.850160 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:19:52 crc kubenswrapper[4691]: I0930 07:19:52.850753 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:19:52 crc kubenswrapper[4691]: I0930 07:19:52.850809 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 07:19:52 crc 
kubenswrapper[4691]: I0930 07:19:52.851741 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0434386cd76a49bfe15e3153e96b9f35396c2bb1bd37e200ecf8ba3d62d6b4a5"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:19:52 crc kubenswrapper[4691]: I0930 07:19:52.851801 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://0434386cd76a49bfe15e3153e96b9f35396c2bb1bd37e200ecf8ba3d62d6b4a5" gracePeriod=600 Sep 30 07:19:53 crc kubenswrapper[4691]: I0930 07:19:53.653784 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="0434386cd76a49bfe15e3153e96b9f35396c2bb1bd37e200ecf8ba3d62d6b4a5" exitCode=0 Sep 30 07:19:53 crc kubenswrapper[4691]: I0930 07:19:53.653912 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"0434386cd76a49bfe15e3153e96b9f35396c2bb1bd37e200ecf8ba3d62d6b4a5"} Sep 30 07:19:53 crc kubenswrapper[4691]: I0930 07:19:53.654204 4691 scope.go:117] "RemoveContainer" containerID="9518cc1f054b42c079b347293caf22adcca4092e127081d19ade2b5433469a6f" Sep 30 07:19:54 crc kubenswrapper[4691]: I0930 07:19:54.667306 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"} Sep 30 07:22:22 crc kubenswrapper[4691]: I0930 07:22:22.849601 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:22:22 crc kubenswrapper[4691]: I0930 07:22:22.850313 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:22:52 crc kubenswrapper[4691]: I0930 07:22:52.850516 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:22:52 crc kubenswrapper[4691]: I0930 07:22:52.851317 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:23:22 crc kubenswrapper[4691]: I0930 07:23:22.850501 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:23:22 crc kubenswrapper[4691]: I0930 07:23:22.851128 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:23:22 crc kubenswrapper[4691]: I0930 07:23:22.851194 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 07:23:22 crc kubenswrapper[4691]: I0930 07:23:22.852266 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:23:22 crc kubenswrapper[4691]: I0930 07:23:22.852360 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" gracePeriod=600 Sep 30 07:23:22 crc kubenswrapper[4691]: E0930 07:23:22.980636 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:23:23 crc kubenswrapper[4691]: I0930 07:23:23.006695 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" exitCode=0 Sep 30 07:23:23 crc kubenswrapper[4691]: I0930 07:23:23.006734 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"} Sep 30 07:23:23 crc kubenswrapper[4691]: I0930 07:23:23.006767 4691 scope.go:117] "RemoveContainer" containerID="0434386cd76a49bfe15e3153e96b9f35396c2bb1bd37e200ecf8ba3d62d6b4a5" Sep 30 07:23:23 crc kubenswrapper[4691]: I0930 07:23:23.007533 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:23:23 crc kubenswrapper[4691]: E0930 07:23:23.007992 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" 
podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:23:37 crc kubenswrapper[4691]: I0930 07:23:37.243516 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:23:37 crc kubenswrapper[4691]: E0930 07:23:37.244651 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:23:48 crc kubenswrapper[4691]: I0930 07:23:48.224643 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:23:48 crc kubenswrapper[4691]: E0930 07:23:48.225366 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:24:03 crc kubenswrapper[4691]: I0930 07:24:03.225052 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:24:03 crc kubenswrapper[4691]: E0930 07:24:03.226139 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:24:16 crc kubenswrapper[4691]: I0930 07:24:16.225046 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:24:16 crc kubenswrapper[4691]: E0930 07:24:16.226030 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:24:29 crc kubenswrapper[4691]: I0930 07:24:29.225372 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:24:29 crc kubenswrapper[4691]: E0930 07:24:29.226515 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:24:40 crc kubenswrapper[4691]: I0930 07:24:40.226247 4691 scope.go:117] "RemoveContainer" 
containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:24:40 crc kubenswrapper[4691]: E0930 07:24:40.227700 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:24:52 crc kubenswrapper[4691]: I0930 07:24:52.225339 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:24:52 crc kubenswrapper[4691]: E0930 07:24:52.226007 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.258050 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-phvmg"] Sep 30 07:25:01 crc kubenswrapper[4691]: E0930 07:25:01.259458 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerName="registry-server" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.259485 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerName="registry-server" Sep 30 07:25:01 crc kubenswrapper[4691]: E0930 07:25:01.259531 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerName="extract-utilities" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.259548 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerName="extract-utilities" Sep 30 07:25:01 crc kubenswrapper[4691]: E0930 07:25:01.259565 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerName="registry-server" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.259578 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerName="registry-server" Sep 30 07:25:01 crc kubenswrapper[4691]: E0930 07:25:01.259594 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerName="extract-utilities" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.259608 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerName="extract-utilities" Sep 30 07:25:01 crc kubenswrapper[4691]: E0930 07:25:01.259668 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerName="extract-content" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.259681 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerName="extract-content" Sep 30 07:25:01 crc kubenswrapper[4691]: E0930 07:25:01.259699 4691 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerName="extract-content" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.259712 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerName="extract-content" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.260146 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1802caf3-f3b8-4d0b-8442-1b587308a100" containerName="registry-server" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.260183 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="38edfb7b-d43d-4492-bc1b-8281e28991c0" containerName="registry-server" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.263299 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phvmg"] Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.263467 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.373528 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-catalog-content\") pod \"community-operators-phvmg\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.373777 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-utilities\") pod \"community-operators-phvmg\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.373947 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmtxh\" (UniqueName: \"kubernetes.io/projected/90da459a-aa19-4d9f-aa05-8f405d40a9bf-kube-api-access-lmtxh\") pod \"community-operators-phvmg\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.477130 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-catalog-content\") pod \"community-operators-phvmg\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.477293 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-utilities\") pod \"community-operators-phvmg\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.477351 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmtxh\" (UniqueName: \"kubernetes.io/projected/90da459a-aa19-4d9f-aa05-8f405d40a9bf-kube-api-access-lmtxh\") pod \"community-operators-phvmg\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.477782 4691 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-catalog-content\") pod \"community-operators-phvmg\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.477811 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-utilities\") pod \"community-operators-phvmg\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.501063 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmtxh\" (UniqueName: \"kubernetes.io/projected/90da459a-aa19-4d9f-aa05-8f405d40a9bf-kube-api-access-lmtxh\") pod \"community-operators-phvmg\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:01 crc kubenswrapper[4691]: I0930 07:25:01.595677 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:02 crc kubenswrapper[4691]: I0930 07:25:02.143372 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-phvmg"] Sep 30 07:25:03 crc kubenswrapper[4691]: I0930 07:25:03.154698 4691 generic.go:334] "Generic (PLEG): container finished" podID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerID="03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24" exitCode=0 Sep 30 07:25:03 crc kubenswrapper[4691]: I0930 07:25:03.154758 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phvmg" event={"ID":"90da459a-aa19-4d9f-aa05-8f405d40a9bf","Type":"ContainerDied","Data":"03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24"} Sep 30 07:25:03 crc kubenswrapper[4691]: I0930 07:25:03.155379 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phvmg" event={"ID":"90da459a-aa19-4d9f-aa05-8f405d40a9bf","Type":"ContainerStarted","Data":"673e0c157bd92f702a9e6b072523b6fdf5216a15f338ed2ca9af6a459fc711c3"} Sep 30 07:25:03 crc kubenswrapper[4691]: I0930 07:25:03.158229 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:25:03 crc kubenswrapper[4691]: I0930 07:25:03.225029 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:25:03 crc kubenswrapper[4691]: E0930 07:25:03.225584 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:25:05 crc kubenswrapper[4691]: I0930 07:25:05.197494 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phvmg" event={"ID":"90da459a-aa19-4d9f-aa05-8f405d40a9bf","Type":"ContainerStarted","Data":"af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39"} Sep 30 07:25:06 crc 
kubenswrapper[4691]: I0930 07:25:06.241220 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n8hzw"] Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.247924 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.257097 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8hzw"] Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.378630 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f88dh\" (UniqueName: \"kubernetes.io/projected/c9727131-b423-44ef-a20b-b32fbc74de93-kube-api-access-f88dh\") pod \"redhat-operators-n8hzw\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") " pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.378740 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-utilities\") pod \"redhat-operators-n8hzw\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") " pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.378788 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-catalog-content\") pod \"redhat-operators-n8hzw\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") " pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.481069 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f88dh\" (UniqueName: \"kubernetes.io/projected/c9727131-b423-44ef-a20b-b32fbc74de93-kube-api-access-f88dh\") pod \"redhat-operators-n8hzw\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") " pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.481251 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-utilities\") pod \"redhat-operators-n8hzw\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") " pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.481346 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-catalog-content\") pod \"redhat-operators-n8hzw\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") " pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.481766 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-utilities\") pod \"redhat-operators-n8hzw\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") " pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.481808 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-catalog-content\") pod 
\"redhat-operators-n8hzw\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") " pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.512813 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f88dh\" (UniqueName: \"kubernetes.io/projected/c9727131-b423-44ef-a20b-b32fbc74de93-kube-api-access-f88dh\") pod \"redhat-operators-n8hzw\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") " pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:06 crc kubenswrapper[4691]: I0930 07:25:06.584142 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8hzw" Sep 30 07:25:07 crc kubenswrapper[4691]: I0930 07:25:07.076472 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n8hzw"] Sep 30 07:25:07 crc kubenswrapper[4691]: I0930 07:25:07.217569 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hzw" event={"ID":"c9727131-b423-44ef-a20b-b32fbc74de93","Type":"ContainerStarted","Data":"e1e3601d7be33ea1aae36b29c888347078f14c2ebae7be7ff52238685912def7"} Sep 30 07:25:08 crc kubenswrapper[4691]: I0930 07:25:08.237155 4691 generic.go:334] "Generic (PLEG): container finished" podID="c9727131-b423-44ef-a20b-b32fbc74de93" containerID="5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c" exitCode=0 Sep 30 07:25:08 crc kubenswrapper[4691]: I0930 07:25:08.237496 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hzw" event={"ID":"c9727131-b423-44ef-a20b-b32fbc74de93","Type":"ContainerDied","Data":"5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c"} Sep 30 07:25:10 crc kubenswrapper[4691]: I0930 07:25:10.262202 4691 generic.go:334] "Generic (PLEG): container finished" podID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerID="af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39" exitCode=0 Sep 30 07:25:10 crc kubenswrapper[4691]: I0930 07:25:10.262287 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phvmg" event={"ID":"90da459a-aa19-4d9f-aa05-8f405d40a9bf","Type":"ContainerDied","Data":"af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39"} Sep 30 07:25:11 crc kubenswrapper[4691]: I0930 07:25:11.282483 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hzw" event={"ID":"c9727131-b423-44ef-a20b-b32fbc74de93","Type":"ContainerStarted","Data":"b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139"} Sep 30 07:25:12 crc kubenswrapper[4691]: I0930 07:25:12.316882 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phvmg" event={"ID":"90da459a-aa19-4d9f-aa05-8f405d40a9bf","Type":"ContainerStarted","Data":"d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a"} Sep 30 07:25:12 crc kubenswrapper[4691]: I0930 07:25:12.378076 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-phvmg" podStartSLOduration=3.582812161 podStartE2EDuration="11.378048209s" podCreationTimestamp="2025-09-30 07:25:01 +0000 UTC" firstStartedPulling="2025-09-30 07:25:03.157972163 +0000 UTC m=+3946.632993203" lastFinishedPulling="2025-09-30 07:25:10.953208221 +0000 UTC m=+3954.428229251" observedRunningTime="2025-09-30 07:25:12.338553335 +0000 UTC m=+3955.813574385" 
watchObservedRunningTime="2025-09-30 07:25:12.378048209 +0000 UTC m=+3955.853069269" Sep 30 07:25:16 crc kubenswrapper[4691]: I0930 07:25:16.368050 4691 generic.go:334] "Generic (PLEG): container finished" podID="c9727131-b423-44ef-a20b-b32fbc74de93" containerID="b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139" exitCode=0 Sep 30 07:25:16 crc kubenswrapper[4691]: I0930 07:25:16.368161 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hzw" event={"ID":"c9727131-b423-44ef-a20b-b32fbc74de93","Type":"ContainerDied","Data":"b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139"} Sep 30 07:25:18 crc kubenswrapper[4691]: I0930 07:25:18.225033 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:25:18 crc kubenswrapper[4691]: E0930 07:25:18.225587 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:25:18 crc kubenswrapper[4691]: I0930 07:25:18.395592 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hzw" event={"ID":"c9727131-b423-44ef-a20b-b32fbc74de93","Type":"ContainerStarted","Data":"012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519"} Sep 30 07:25:18 crc kubenswrapper[4691]: I0930 07:25:18.435474 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n8hzw" podStartSLOduration=3.636951856 podStartE2EDuration="12.435453872s" podCreationTimestamp="2025-09-30 07:25:06 +0000 UTC" firstStartedPulling="2025-09-30 07:25:08.240842378 +0000 UTC m=+3951.715863428" lastFinishedPulling="2025-09-30 07:25:17.039344374 +0000 UTC m=+3960.514365444" observedRunningTime="2025-09-30 07:25:18.420548216 +0000 UTC m=+3961.895569256" watchObservedRunningTime="2025-09-30 07:25:18.435453872 +0000 UTC m=+3961.910474912" Sep 30 07:25:21 crc kubenswrapper[4691]: I0930 07:25:21.597117 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:21 crc kubenswrapper[4691]: I0930 07:25:21.597484 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:21 crc kubenswrapper[4691]: I0930 07:25:21.689463 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:22 crc kubenswrapper[4691]: I0930 07:25:22.520906 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:22 crc kubenswrapper[4691]: I0930 07:25:22.623298 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-phvmg"] Sep 30 07:25:24 crc kubenswrapper[4691]: I0930 07:25:24.462396 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-phvmg" podUID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerName="registry-server" 
containerID="cri-o://d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a" gracePeriod=2 Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.129495 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.247685 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-utilities\") pod \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.247790 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmtxh\" (UniqueName: \"kubernetes.io/projected/90da459a-aa19-4d9f-aa05-8f405d40a9bf-kube-api-access-lmtxh\") pod \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.247877 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-catalog-content\") pod \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\" (UID: \"90da459a-aa19-4d9f-aa05-8f405d40a9bf\") " Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.248324 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-utilities" (OuterVolumeSpecName: "utilities") pod "90da459a-aa19-4d9f-aa05-8f405d40a9bf" (UID: "90da459a-aa19-4d9f-aa05-8f405d40a9bf"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.261491 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90da459a-aa19-4d9f-aa05-8f405d40a9bf-kube-api-access-lmtxh" (OuterVolumeSpecName: "kube-api-access-lmtxh") pod "90da459a-aa19-4d9f-aa05-8f405d40a9bf" (UID: "90da459a-aa19-4d9f-aa05-8f405d40a9bf"). InnerVolumeSpecName "kube-api-access-lmtxh". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.294958 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90da459a-aa19-4d9f-aa05-8f405d40a9bf" (UID: "90da459a-aa19-4d9f-aa05-8f405d40a9bf"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.350401 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.350534 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90da459a-aa19-4d9f-aa05-8f405d40a9bf-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.350634 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmtxh\" (UniqueName: \"kubernetes.io/projected/90da459a-aa19-4d9f-aa05-8f405d40a9bf-kube-api-access-lmtxh\") on node \"crc\" DevicePath \"\"" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.481434 4691 generic.go:334] "Generic (PLEG): container finished" podID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerID="d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a" exitCode=0 Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.481479 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-phvmg" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.481494 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phvmg" event={"ID":"90da459a-aa19-4d9f-aa05-8f405d40a9bf","Type":"ContainerDied","Data":"d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a"} Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.481543 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-phvmg" event={"ID":"90da459a-aa19-4d9f-aa05-8f405d40a9bf","Type":"ContainerDied","Data":"673e0c157bd92f702a9e6b072523b6fdf5216a15f338ed2ca9af6a459fc711c3"} Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.481576 4691 scope.go:117] "RemoveContainer" containerID="d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.520479 4691 scope.go:117] "RemoveContainer" containerID="af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.526238 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-phvmg"] Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.543061 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-phvmg"] Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.549150 4691 scope.go:117] "RemoveContainer" containerID="03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.610929 4691 scope.go:117] "RemoveContainer" containerID="d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a" Sep 30 07:25:25 crc kubenswrapper[4691]: E0930 07:25:25.611399 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a\": container with ID starting with d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a not found: ID does not exist" containerID="d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a" Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.611432 
4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a"} err="failed to get container status \"d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a\": rpc error: code = NotFound desc = could not find container \"d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a\": container with ID starting with d1586e16faa0a7e63e01b1b35a6259e100505cf894c3750bb51be8fa222fa53a not found: ID does not exist"
Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.611453 4691 scope.go:117] "RemoveContainer" containerID="af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39"
Sep 30 07:25:25 crc kubenswrapper[4691]: E0930 07:25:25.611819 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39\": container with ID starting with af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39 not found: ID does not exist" containerID="af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39"
Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.611851 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39"} err="failed to get container status \"af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39\": rpc error: code = NotFound desc = could not find container \"af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39\": container with ID starting with af06a7b15e4fa95b4fa41faca5fd6fb7dc1eb11e88d63499664a6e582727ce39 not found: ID does not exist"
Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.611865 4691 scope.go:117] "RemoveContainer" containerID="03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24"
Sep 30 07:25:25 crc kubenswrapper[4691]: E0930 07:25:25.612247 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24\": container with ID starting with 03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24 not found: ID does not exist" containerID="03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24"
Sep 30 07:25:25 crc kubenswrapper[4691]: I0930 07:25:25.612274 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24"} err="failed to get container status \"03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24\": rpc error: code = NotFound desc = could not find container \"03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24\": container with ID starting with 03b6712b53deb4d942066b2f7790605e76ede178e6d3df61b3adc28977815c24 not found: ID does not exist"
Sep 30 07:25:26 crc kubenswrapper[4691]: I0930 07:25:26.584744 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n8hzw"
Sep 30 07:25:26 crc kubenswrapper[4691]: I0930 07:25:26.585191 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n8hzw"
Sep 30 07:25:26 crc kubenswrapper[4691]: I0930 07:25:26.670915 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n8hzw"
Sep 30 07:25:27 crc kubenswrapper[4691]: I0930 07:25:27.241238 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" path="/var/lib/kubelet/pods/90da459a-aa19-4d9f-aa05-8f405d40a9bf/volumes"
Sep 30 07:25:27 crc kubenswrapper[4691]: I0930 07:25:27.557229 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n8hzw"
Sep 30 07:25:28 crc kubenswrapper[4691]: I0930 07:25:28.026822 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8hzw"]
Sep 30 07:25:29 crc kubenswrapper[4691]: I0930 07:25:29.519592 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n8hzw" podUID="c9727131-b423-44ef-a20b-b32fbc74de93" containerName="registry-server" containerID="cri-o://012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519" gracePeriod=2
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.053025 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8hzw"
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.145026 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-utilities\") pod \"c9727131-b423-44ef-a20b-b32fbc74de93\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") "
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.145435 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-catalog-content\") pod \"c9727131-b423-44ef-a20b-b32fbc74de93\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") "
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.145463 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f88dh\" (UniqueName: \"kubernetes.io/projected/c9727131-b423-44ef-a20b-b32fbc74de93-kube-api-access-f88dh\") pod \"c9727131-b423-44ef-a20b-b32fbc74de93\" (UID: \"c9727131-b423-44ef-a20b-b32fbc74de93\") "
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.146246 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-utilities" (OuterVolumeSpecName: "utilities") pod "c9727131-b423-44ef-a20b-b32fbc74de93" (UID: "c9727131-b423-44ef-a20b-b32fbc74de93"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.150533 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9727131-b423-44ef-a20b-b32fbc74de93-kube-api-access-f88dh" (OuterVolumeSpecName: "kube-api-access-f88dh") pod "c9727131-b423-44ef-a20b-b32fbc74de93" (UID: "c9727131-b423-44ef-a20b-b32fbc74de93"). InnerVolumeSpecName "kube-api-access-f88dh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.247348 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f88dh\" (UniqueName: \"kubernetes.io/projected/c9727131-b423-44ef-a20b-b32fbc74de93-kube-api-access-f88dh\") on node \"crc\" DevicePath \"\""
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.247420 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.252094 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9727131-b423-44ef-a20b-b32fbc74de93" (UID: "c9727131-b423-44ef-a20b-b32fbc74de93"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.349531 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9727131-b423-44ef-a20b-b32fbc74de93-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.532263 4691 generic.go:334] "Generic (PLEG): container finished" podID="c9727131-b423-44ef-a20b-b32fbc74de93" containerID="012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519" exitCode=0
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.532324 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hzw" event={"ID":"c9727131-b423-44ef-a20b-b32fbc74de93","Type":"ContainerDied","Data":"012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519"}
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.532367 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n8hzw"
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.532394 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n8hzw" event={"ID":"c9727131-b423-44ef-a20b-b32fbc74de93","Type":"ContainerDied","Data":"e1e3601d7be33ea1aae36b29c888347078f14c2ebae7be7ff52238685912def7"}
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.532419 4691 scope.go:117] "RemoveContainer" containerID="012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519"
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.574612 4691 scope.go:117] "RemoveContainer" containerID="b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139"
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.580012 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n8hzw"]
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.586425 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n8hzw"]
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.617455 4691 scope.go:117] "RemoveContainer" containerID="5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c"
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.646487 4691 scope.go:117] "RemoveContainer" containerID="012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519"
Sep 30 07:25:30 crc kubenswrapper[4691]: E0930 07:25:30.647645 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519\": container with ID starting with 012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519 not found: ID does not exist" containerID="012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519"
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.647690 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519"} err="failed to get container status \"012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519\": rpc error: code = NotFound desc = could not find container \"012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519\": container with ID starting with 012ca3458ccb50b233bdb8425b4b9e9a5988546c841bc0fdc839021135900519 not found: ID does not exist"
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.647717 4691 scope.go:117] "RemoveContainer" containerID="b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139"
Sep 30 07:25:30 crc kubenswrapper[4691]: E0930 07:25:30.648063 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139\": container with ID starting with b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139 not found: ID does not exist" containerID="b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139"
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.648085 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139"} err="failed to get container status \"b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139\": rpc error: code = NotFound desc = could not find container \"b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139\": container with ID starting with b60bfcc215c4f8cac66865448662fa2fa52e8d3673d74f0b81bca0e96ec8e139 not found: ID does not exist"
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.648099 4691 scope.go:117] "RemoveContainer" containerID="5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c"
Sep 30 07:25:30 crc kubenswrapper[4691]: E0930 07:25:30.648427 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c\": container with ID starting with 5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c not found: ID does not exist" containerID="5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c"
Sep 30 07:25:30 crc kubenswrapper[4691]: I0930 07:25:30.648457 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c"} err="failed to get container status \"5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c\": rpc error: code = NotFound desc = could not find container \"5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c\": container with ID starting with 5371bd070d0862033c2fa6969c54cc5dd10f2727b6fda384103670ac8a5e1d3c not found: ID does not exist"
Sep 30 07:25:31 crc kubenswrapper[4691]: I0930 07:25:31.226368 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:25:31 crc kubenswrapper[4691]: E0930 07:25:31.227187 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:25:31 crc kubenswrapper[4691]: I0930 07:25:31.237699 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9727131-b423-44ef-a20b-b32fbc74de93" path="/var/lib/kubelet/pods/c9727131-b423-44ef-a20b-b32fbc74de93/volumes"
Sep 30 07:25:46 crc kubenswrapper[4691]: I0930 07:25:46.225070 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:25:46 crc kubenswrapper[4691]: E0930 07:25:46.226307 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:25:57 crc kubenswrapper[4691]: I0930 07:25:57.234546 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:25:57 crc kubenswrapper[4691]: E0930 07:25:57.236197 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:26:11 crc kubenswrapper[4691]: I0930 07:26:11.224703 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:26:11 crc kubenswrapper[4691]: E0930 07:26:11.226041 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:26:25 crc kubenswrapper[4691]: I0930 07:26:25.225627 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:26:25 crc kubenswrapper[4691]: E0930 07:26:25.226610 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:26:38 crc kubenswrapper[4691]: I0930 07:26:38.224845 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:26:38 crc kubenswrapper[4691]: E0930 07:26:38.225744 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:26:53 crc kubenswrapper[4691]: I0930 07:26:53.225843 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:26:53 crc kubenswrapper[4691]: E0930 07:26:53.226824 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:27:06 crc kubenswrapper[4691]: I0930 07:27:06.225150 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:27:06 crc kubenswrapper[4691]: E0930 07:27:06.226310 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.420517 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvx"]
Sep 30 07:27:12 crc kubenswrapper[4691]: E0930 07:27:12.421581 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerName="registry-server"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.421595 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerName="registry-server"
Sep 30 07:27:12 crc kubenswrapper[4691]: E0930 07:27:12.421620 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9727131-b423-44ef-a20b-b32fbc74de93" containerName="registry-server"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.421626 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9727131-b423-44ef-a20b-b32fbc74de93" containerName="registry-server"
Sep 30 07:27:12 crc kubenswrapper[4691]: E0930 07:27:12.421640 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerName="extract-utilities"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.421646 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerName="extract-utilities"
Sep 30 07:27:12 crc kubenswrapper[4691]: E0930 07:27:12.421655 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerName="extract-content"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.421660 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerName="extract-content"
Sep 30 07:27:12 crc kubenswrapper[4691]: E0930 07:27:12.421669 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9727131-b423-44ef-a20b-b32fbc74de93" containerName="extract-content"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.421677 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9727131-b423-44ef-a20b-b32fbc74de93" containerName="extract-content"
Sep 30 07:27:12 crc kubenswrapper[4691]: E0930 07:27:12.421710 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9727131-b423-44ef-a20b-b32fbc74de93" containerName="extract-utilities"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.421716 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9727131-b423-44ef-a20b-b32fbc74de93" containerName="extract-utilities"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.421904 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="90da459a-aa19-4d9f-aa05-8f405d40a9bf" containerName="registry-server"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.421930 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9727131-b423-44ef-a20b-b32fbc74de93" containerName="registry-server"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.423546 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.441818 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvx"]
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.531488 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-utilities\") pod \"redhat-marketplace-5sjvx\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") " pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.531558 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhzrm\" (UniqueName: \"kubernetes.io/projected/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-kube-api-access-vhzrm\") pod \"redhat-marketplace-5sjvx\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") " pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.531593 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-catalog-content\") pod \"redhat-marketplace-5sjvx\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") " pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.632866 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-utilities\") pod \"redhat-marketplace-5sjvx\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") " pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.632933 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhzrm\" (UniqueName: \"kubernetes.io/projected/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-kube-api-access-vhzrm\") pod \"redhat-marketplace-5sjvx\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") " pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.632969 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-catalog-content\") pod \"redhat-marketplace-5sjvx\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") " pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.633551 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-utilities\") pod \"redhat-marketplace-5sjvx\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") " pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.633873 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-catalog-content\") pod \"redhat-marketplace-5sjvx\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") " pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.668671 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhzrm\" (UniqueName: \"kubernetes.io/projected/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-kube-api-access-vhzrm\") pod \"redhat-marketplace-5sjvx\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") " pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:12 crc kubenswrapper[4691]: I0930 07:27:12.744408 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:13 crc kubenswrapper[4691]: I0930 07:27:13.081993 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvx"]
Sep 30 07:27:13 crc kubenswrapper[4691]: I0930 07:27:13.683509 4691 generic.go:334] "Generic (PLEG): container finished" podID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerID="bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f" exitCode=0
Sep 30 07:27:13 crc kubenswrapper[4691]: I0930 07:27:13.683603 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvx" event={"ID":"1f93a3aa-908b-4922-a8aa-6d0c9fb73084","Type":"ContainerDied","Data":"bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f"}
Sep 30 07:27:13 crc kubenswrapper[4691]: I0930 07:27:13.683935 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvx" event={"ID":"1f93a3aa-908b-4922-a8aa-6d0c9fb73084","Type":"ContainerStarted","Data":"771e904441dc2d5d48b0d4cce0adf3bb96ca047fc2abb83f218dc3f9a78ff57d"}
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.003039 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-tdf2r"]
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.007037 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.039579 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tdf2r"]
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.094635 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-catalog-content\") pod \"certified-operators-tdf2r\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") " pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.095106 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-utilities\") pod \"certified-operators-tdf2r\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") " pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.095182 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcnxd\" (UniqueName: \"kubernetes.io/projected/4e5a24ef-cd18-4162-9739-452580bed5a0-kube-api-access-jcnxd\") pod \"certified-operators-tdf2r\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") " pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.201267 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-catalog-content\") pod \"certified-operators-tdf2r\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") " pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.201340 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-utilities\") pod \"certified-operators-tdf2r\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") " pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.201412 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcnxd\" (UniqueName: \"kubernetes.io/projected/4e5a24ef-cd18-4162-9739-452580bed5a0-kube-api-access-jcnxd\") pod \"certified-operators-tdf2r\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") " pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.202500 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-utilities\") pod \"certified-operators-tdf2r\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") " pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.202671 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-catalog-content\") pod \"certified-operators-tdf2r\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") " pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.229610 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcnxd\" (UniqueName: \"kubernetes.io/projected/4e5a24ef-cd18-4162-9739-452580bed5a0-kube-api-access-jcnxd\") pod \"certified-operators-tdf2r\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") " pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.386936 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.708665 4691 generic.go:334] "Generic (PLEG): container finished" podID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerID="3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80" exitCode=0
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.708928 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvx" event={"ID":"1f93a3aa-908b-4922-a8aa-6d0c9fb73084","Type":"ContainerDied","Data":"3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80"}
Sep 30 07:27:15 crc kubenswrapper[4691]: I0930 07:27:15.975072 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-tdf2r"]
Sep 30 07:27:16 crc kubenswrapper[4691]: I0930 07:27:16.718824 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvx" event={"ID":"1f93a3aa-908b-4922-a8aa-6d0c9fb73084","Type":"ContainerStarted","Data":"cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56"}
Sep 30 07:27:16 crc kubenswrapper[4691]: I0930 07:27:16.720230 4691 generic.go:334] "Generic (PLEG): container finished" podID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerID="2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7" exitCode=0
Sep 30 07:27:16 crc kubenswrapper[4691]: I0930 07:27:16.720266 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdf2r" event={"ID":"4e5a24ef-cd18-4162-9739-452580bed5a0","Type":"ContainerDied","Data":"2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7"}
Sep 30 07:27:16 crc kubenswrapper[4691]: I0930 07:27:16.720280 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdf2r" event={"ID":"4e5a24ef-cd18-4162-9739-452580bed5a0","Type":"ContainerStarted","Data":"5179f50a9fc01d80ae31351c7d226ee431be35e5e9e85d6230293ae351f87439"}
Sep 30 07:27:16 crc kubenswrapper[4691]: I0930 07:27:16.745106 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5sjvx" podStartSLOduration=2.338517844 podStartE2EDuration="4.745084105s" podCreationTimestamp="2025-09-30 07:27:12 +0000 UTC" firstStartedPulling="2025-09-30 07:27:13.685456833 +0000 UTC m=+4077.160477873" lastFinishedPulling="2025-09-30 07:27:16.092023084 +0000 UTC m=+4079.567044134" observedRunningTime="2025-09-30 07:27:16.737104229 +0000 UTC m=+4080.212125269" watchObservedRunningTime="2025-09-30 07:27:16.745084105 +0000 UTC m=+4080.220105145"
Sep 30 07:27:17 crc kubenswrapper[4691]: I0930 07:27:17.235198 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:27:17 crc kubenswrapper[4691]: E0930 07:27:17.236207 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:27:18 crc kubenswrapper[4691]: I0930 07:27:18.742690 4691 generic.go:334] "Generic (PLEG): container finished" podID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerID="3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042" exitCode=0
Sep 30 07:27:18 crc kubenswrapper[4691]: I0930 07:27:18.742761 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdf2r" event={"ID":"4e5a24ef-cd18-4162-9739-452580bed5a0","Type":"ContainerDied","Data":"3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042"}
Sep 30 07:27:19 crc kubenswrapper[4691]: I0930 07:27:19.752766 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdf2r" event={"ID":"4e5a24ef-cd18-4162-9739-452580bed5a0","Type":"ContainerStarted","Data":"d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263"}
Sep 30 07:27:19 crc kubenswrapper[4691]: I0930 07:27:19.778045 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-tdf2r" podStartSLOduration=3.114742859 podStartE2EDuration="5.778019311s" podCreationTimestamp="2025-09-30 07:27:14 +0000 UTC" firstStartedPulling="2025-09-30 07:27:16.722152581 +0000 UTC m=+4080.197173621" lastFinishedPulling="2025-09-30 07:27:19.385429003 +0000 UTC m=+4082.860450073" observedRunningTime="2025-09-30 07:27:19.774353804 +0000 UTC m=+4083.249374844" watchObservedRunningTime="2025-09-30 07:27:19.778019311 +0000 UTC m=+4083.253040351"
Sep 30 07:27:22 crc kubenswrapper[4691]: I0930 07:27:22.745617 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:22 crc kubenswrapper[4691]: I0930 07:27:22.746226 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:22 crc kubenswrapper[4691]: I0930 07:27:22.803954 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:22 crc kubenswrapper[4691]: I0930 07:27:22.873532 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:23 crc kubenswrapper[4691]: I0930 07:27:23.989714 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvx"]
Sep 30 07:27:24 crc kubenswrapper[4691]: I0930 07:27:24.807599 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5sjvx" podUID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerName="registry-server" containerID="cri-o://cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56" gracePeriod=2
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.329357 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.387572 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.387623 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.435848 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.463157 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-utilities\") pod \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") "
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.463461 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhzrm\" (UniqueName: \"kubernetes.io/projected/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-kube-api-access-vhzrm\") pod \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") "
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.463494 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-catalog-content\") pod \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\" (UID: \"1f93a3aa-908b-4922-a8aa-6d0c9fb73084\") "
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.464431 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-utilities" (OuterVolumeSpecName: "utilities") pod "1f93a3aa-908b-4922-a8aa-6d0c9fb73084" (UID: "1f93a3aa-908b-4922-a8aa-6d0c9fb73084"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.475406 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-kube-api-access-vhzrm" (OuterVolumeSpecName: "kube-api-access-vhzrm") pod "1f93a3aa-908b-4922-a8aa-6d0c9fb73084" (UID: "1f93a3aa-908b-4922-a8aa-6d0c9fb73084"). InnerVolumeSpecName "kube-api-access-vhzrm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.502844 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1f93a3aa-908b-4922-a8aa-6d0c9fb73084" (UID: "1f93a3aa-908b-4922-a8aa-6d0c9fb73084"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.566067 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.566104 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhzrm\" (UniqueName: \"kubernetes.io/projected/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-kube-api-access-vhzrm\") on node \"crc\" DevicePath \"\""
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.566114 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1f93a3aa-908b-4922-a8aa-6d0c9fb73084-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.818086 4691 generic.go:334] "Generic (PLEG): container finished" podID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerID="cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56" exitCode=0
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.818418 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvx" event={"ID":"1f93a3aa-908b-4922-a8aa-6d0c9fb73084","Type":"ContainerDied","Data":"cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56"}
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.818465 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sjvx" event={"ID":"1f93a3aa-908b-4922-a8aa-6d0c9fb73084","Type":"ContainerDied","Data":"771e904441dc2d5d48b0d4cce0adf3bb96ca047fc2abb83f218dc3f9a78ff57d"}
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.818490 4691 scope.go:117] "RemoveContainer" containerID="cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56"
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.818567 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sjvx"
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.864785 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvx"]
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.877465 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sjvx"]
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.898412 4691 scope.go:117] "RemoveContainer" containerID="3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80"
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.968564 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:25 crc kubenswrapper[4691]: I0930 07:27:25.983842 4691 scope.go:117] "RemoveContainer" containerID="bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f"
Sep 30 07:27:26 crc kubenswrapper[4691]: I0930 07:27:26.024384 4691 scope.go:117] "RemoveContainer" containerID="cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56"
Sep 30 07:27:26 crc kubenswrapper[4691]: E0930 07:27:26.026094 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56\": container with ID starting with cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56 not found: ID does not exist" containerID="cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56"
Sep 30 07:27:26 crc kubenswrapper[4691]: I0930 07:27:26.026129 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56"} err="failed to get container status \"cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56\": rpc error: code = NotFound desc = could not find container \"cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56\": container with ID starting with cb91c581452510d38e6ed5ada31a194f269dd4f46e9c84f5d85b692290defe56 not found: ID does not exist"
Sep 30 07:27:26 crc kubenswrapper[4691]: I0930 07:27:26.026148 4691 scope.go:117] "RemoveContainer" containerID="3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80"
Sep 30 07:27:26 crc kubenswrapper[4691]: E0930 07:27:26.026601 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80\": container with ID starting with 3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80 not found: ID does not exist" containerID="3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80"
Sep 30 07:27:26 crc kubenswrapper[4691]: I0930 07:27:26.026619 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80"} err="failed to get container status \"3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80\": rpc error: code = NotFound desc = could not find container \"3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80\": container with ID starting with 3e263adab74470bf89f1fcfe1b7bb711329f4bd750e225ef5a93592cbe529d80 not found: ID does not exist"
Sep 30 07:27:26 crc kubenswrapper[4691]: I0930 07:27:26.026631 4691 scope.go:117] "RemoveContainer" containerID="bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f"
Sep 30 07:27:26 crc kubenswrapper[4691]: E0930 07:27:26.026930 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f\": container with ID starting with bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f not found: ID does not exist" containerID="bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f"
Sep 30 07:27:26 crc kubenswrapper[4691]: I0930 07:27:26.026948 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f"} err="failed to get container status \"bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f\": rpc error: code = NotFound desc = could not find container \"bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f\": container with ID starting with bf69fa6bc62f09f0ab228f07521b86265779a264676e6df1f2401c55d5bdd91f not found: ID does not exist"
Sep 30 07:27:27 crc kubenswrapper[4691]: I0930 07:27:27.247115 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" path="/var/lib/kubelet/pods/1f93a3aa-908b-4922-a8aa-6d0c9fb73084/volumes"
Sep 30 07:27:27 crc kubenswrapper[4691]: I0930 07:27:27.795657 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tdf2r"]
Sep 30 07:27:27 crc kubenswrapper[4691]: I0930 07:27:27.844590 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-tdf2r" podUID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerName="registry-server" containerID="cri-o://d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263" gracePeriod=2
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.349608 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.434659 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-catalog-content\") pod \"4e5a24ef-cd18-4162-9739-452580bed5a0\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") "
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.435044 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcnxd\" (UniqueName: \"kubernetes.io/projected/4e5a24ef-cd18-4162-9739-452580bed5a0-kube-api-access-jcnxd\") pod \"4e5a24ef-cd18-4162-9739-452580bed5a0\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") "
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.435188 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-utilities\") pod \"4e5a24ef-cd18-4162-9739-452580bed5a0\" (UID: \"4e5a24ef-cd18-4162-9739-452580bed5a0\") "
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.436253 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-utilities" (OuterVolumeSpecName: "utilities") pod "4e5a24ef-cd18-4162-9739-452580bed5a0" (UID: "4e5a24ef-cd18-4162-9739-452580bed5a0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.453108 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e5a24ef-cd18-4162-9739-452580bed5a0-kube-api-access-jcnxd" (OuterVolumeSpecName: "kube-api-access-jcnxd") pod "4e5a24ef-cd18-4162-9739-452580bed5a0" (UID: "4e5a24ef-cd18-4162-9739-452580bed5a0"). InnerVolumeSpecName "kube-api-access-jcnxd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.484446 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4e5a24ef-cd18-4162-9739-452580bed5a0" (UID: "4e5a24ef-cd18-4162-9739-452580bed5a0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.538018 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcnxd\" (UniqueName: \"kubernetes.io/projected/4e5a24ef-cd18-4162-9739-452580bed5a0-kube-api-access-jcnxd\") on node \"crc\" DevicePath \"\""
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.538056 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.538070 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4e5a24ef-cd18-4162-9739-452580bed5a0-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.856393 4691 generic.go:334] "Generic (PLEG): container finished" podID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerID="d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263" exitCode=0
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.856471 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-tdf2r"
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.856462 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdf2r" event={"ID":"4e5a24ef-cd18-4162-9739-452580bed5a0","Type":"ContainerDied","Data":"d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263"}
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.856619 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-tdf2r" event={"ID":"4e5a24ef-cd18-4162-9739-452580bed5a0","Type":"ContainerDied","Data":"5179f50a9fc01d80ae31351c7d226ee431be35e5e9e85d6230293ae351f87439"}
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.856647 4691 scope.go:117] "RemoveContainer" containerID="d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263"
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.886969 4691 scope.go:117] "RemoveContainer" containerID="3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042"
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.908738 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-tdf2r"]
Sep 30 07:27:28 crc kubenswrapper[4691]: I0930 07:27:28.919066 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-tdf2r"]
Sep 30 07:27:29 crc kubenswrapper[4691]: I0930 07:27:29.237678 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4e5a24ef-cd18-4162-9739-452580bed5a0" path="/var/lib/kubelet/pods/4e5a24ef-cd18-4162-9739-452580bed5a0/volumes"
Sep 30 07:27:29 crc kubenswrapper[4691]: I0930 07:27:29.523462 4691 scope.go:117] "RemoveContainer" containerID="2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7"
Sep 30 07:27:29 crc kubenswrapper[4691]: I0930 07:27:29.598550 4691 scope.go:117] "RemoveContainer" containerID="d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263"
Sep 30 07:27:29 crc kubenswrapper[4691]: E0930 07:27:29.599081 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263\": container with ID starting with d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263 not found: ID does not exist" containerID="d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263"
Sep 30 07:27:29 crc kubenswrapper[4691]: I0930 07:27:29.599120 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263"} err="failed to get container status \"d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263\": rpc error: code = NotFound desc = could not find container \"d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263\": container with ID starting with d11129e515549560c075e82a579848128df3c2cfca13ab8e82670427926fd263 not found: ID does not exist"
Sep 30 07:27:29 crc kubenswrapper[4691]: I0930 07:27:29.599156 4691 scope.go:117] "RemoveContainer" containerID="3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042"
Sep 30 07:27:29 crc kubenswrapper[4691]: E0930 07:27:29.599665 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042\": container with ID starting with 3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042 not found: ID does not exist" containerID="3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042"
Sep 30 07:27:29 crc kubenswrapper[4691]: I0930 07:27:29.599731 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042"} err="failed to get container status \"3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042\": rpc error: code = NotFound desc = could not find container \"3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042\": container with ID starting with 3baea762dcab0b2c60b43df49a512d682f64f53e9edb3f211232c4d4ec7fd042 not found: ID does not exist"
Sep 30 07:27:29 crc kubenswrapper[4691]: I0930 07:27:29.599758 4691 scope.go:117] "RemoveContainer" containerID="2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7"
Sep 30 07:27:29 crc kubenswrapper[4691]: E0930 07:27:29.600269 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7\": container with ID starting with 2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7 not found: ID does not exist" containerID="2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7"
Sep 30 07:27:29 crc kubenswrapper[4691]: I0930 07:27:29.600311 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7"} err="failed to get container status \"2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7\": rpc error: code = NotFound desc = could not find container \"2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7\": container with ID starting with 2c98201460c55ce325e072d0abb9b7937e524e2d75c2cd31704d936dc3d64cd7 not found: ID does not exist"
Sep 30 07:27:31 crc kubenswrapper[4691]: I0930 07:27:31.226972 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:27:31 crc kubenswrapper[4691]: E0930 07:27:31.228117 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:27:46 crc kubenswrapper[4691]: I0930 07:27:46.224683 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:27:46 crc kubenswrapper[4691]: E0930 07:27:46.225507 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:28:01 crc kubenswrapper[4691]: I0930 07:28:01.225481 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:28:01 crc kubenswrapper[4691]: E0930 07:28:01.226484 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:28:15 crc kubenswrapper[4691]: I0930 07:28:15.225360 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:28:15 crc kubenswrapper[4691]: E0930 07:28:15.226429 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:28:29 crc kubenswrapper[4691]: I0930 07:28:29.225154 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720"
Sep 30 07:28:29 crc kubenswrapper[4691]: I0930 07:28:29.604033 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"3c8dbde14b0f238a12d2a17d184b4856046cef36a2f8cbe725380984abbf070e"}
Sep 30 07:29:00 crc kubenswrapper[4691]: E0930 07:29:00.026549 4691 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.46:58622->38.102.83.46:45179: write tcp 38.102.83.46:58622->38.102.83.46:45179: write: broken pipe
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.170446 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"]
Sep 30 07:30:00 crc kubenswrapper[4691]: E0930 07:30:00.171959 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerName="registry-server"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.171991 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerName="registry-server"
Sep 30 07:30:00 crc kubenswrapper[4691]: E0930 07:30:00.172042 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerName="extract-content"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.172061 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerName="extract-content"
Sep 30 07:30:00 crc kubenswrapper[4691]: E0930 07:30:00.172090 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerName="extract-utilities"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.172107 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerName="extract-utilities"
Sep 30 07:30:00 crc kubenswrapper[4691]: E0930 07:30:00.172172 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerName="registry-server"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.172191 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerName="registry-server"
Sep 30 07:30:00 crc kubenswrapper[4691]: E0930 07:30:00.172225 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerName="extract-utilities"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.172242 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerName="extract-utilities"
Sep 30 07:30:00 crc kubenswrapper[4691]: E0930 07:30:00.172269 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerName="extract-content"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.172285 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerName="extract-content"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.172748 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e5a24ef-cd18-4162-9739-452580bed5a0" containerName="registry-server"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.172806 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f93a3aa-908b-4922-a8aa-6d0c9fb73084" containerName="registry-server"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.174392 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.176723 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.177599 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.186604 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"]
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.224201 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3345756c-7e50-4d96-9fd0-da2efbdac54f-config-volume\") pod \"collect-profiles-29320290-9v9zv\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.224494 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfvtz\" (UniqueName: \"kubernetes.io/projected/3345756c-7e50-4d96-9fd0-da2efbdac54f-kube-api-access-nfvtz\") pod \"collect-profiles-29320290-9v9zv\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.224656 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3345756c-7e50-4d96-9fd0-da2efbdac54f-secret-volume\") pod \"collect-profiles-29320290-9v9zv\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.326864 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3345756c-7e50-4d96-9fd0-da2efbdac54f-config-volume\") pod \"collect-profiles-29320290-9v9zv\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.327006 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfvtz\" (UniqueName: \"kubernetes.io/projected/3345756c-7e50-4d96-9fd0-da2efbdac54f-kube-api-access-nfvtz\") pod \"collect-profiles-29320290-9v9zv\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.327042 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3345756c-7e50-4d96-9fd0-da2efbdac54f-secret-volume\") pod \"collect-profiles-29320290-9v9zv\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.328985 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3345756c-7e50-4d96-9fd0-da2efbdac54f-config-volume\") pod \"collect-profiles-29320290-9v9zv\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.336833 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3345756c-7e50-4d96-9fd0-da2efbdac54f-secret-volume\") pod \"collect-profiles-29320290-9v9zv\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.342562 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfvtz\" (UniqueName: \"kubernetes.io/projected/3345756c-7e50-4d96-9fd0-da2efbdac54f-kube-api-access-nfvtz\") pod \"collect-profiles-29320290-9v9zv\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.511606 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:00 crc kubenswrapper[4691]: I0930 07:30:00.981535 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"]
Sep 30 07:30:01 crc kubenswrapper[4691]: I0930 07:30:01.700710 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv" event={"ID":"3345756c-7e50-4d96-9fd0-da2efbdac54f","Type":"ContainerStarted","Data":"30738f2cd56b08c050aaa6e6f8c4f3bfdefe347e631440a0ae1594ffabaaf5a7"}
Sep 30 07:30:01 crc kubenswrapper[4691]: I0930 07:30:01.701163 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv" event={"ID":"3345756c-7e50-4d96-9fd0-da2efbdac54f","Type":"ContainerStarted","Data":"67aadd516c0d6df3a2e3ec3f04384b990da959e3b0a6dfe99b19ce8a01a16337"}
Sep 30 07:30:02 crc kubenswrapper[4691]: I0930 07:30:02.714237 4691 generic.go:334] "Generic (PLEG): container finished" podID="3345756c-7e50-4d96-9fd0-da2efbdac54f" containerID="30738f2cd56b08c050aaa6e6f8c4f3bfdefe347e631440a0ae1594ffabaaf5a7" exitCode=0
Sep 30 07:30:02 crc kubenswrapper[4691]: I0930 07:30:02.714336 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv" event={"ID":"3345756c-7e50-4d96-9fd0-da2efbdac54f","Type":"ContainerDied","Data":"30738f2cd56b08c050aaa6e6f8c4f3bfdefe347e631440a0ae1594ffabaaf5a7"}
Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.109470 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv"
Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.206466 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3345756c-7e50-4d96-9fd0-da2efbdac54f-config-volume\") pod \"3345756c-7e50-4d96-9fd0-da2efbdac54f\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") "
Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.206942 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3345756c-7e50-4d96-9fd0-da2efbdac54f-secret-volume\") pod \"3345756c-7e50-4d96-9fd0-da2efbdac54f\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") "
Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.207126 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfvtz\" (UniqueName: \"kubernetes.io/projected/3345756c-7e50-4d96-9fd0-da2efbdac54f-kube-api-access-nfvtz\") pod \"3345756c-7e50-4d96-9fd0-da2efbdac54f\" (UID: \"3345756c-7e50-4d96-9fd0-da2efbdac54f\") "
Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.207334 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3345756c-7e50-4d96-9fd0-da2efbdac54f-config-volume" (OuterVolumeSpecName: "config-volume") pod "3345756c-7e50-4d96-9fd0-da2efbdac54f" (UID: "3345756c-7e50-4d96-9fd0-da2efbdac54f"). InnerVolumeSpecName "config-volume".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.207785 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3345756c-7e50-4d96-9fd0-da2efbdac54f-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.217062 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3345756c-7e50-4d96-9fd0-da2efbdac54f-kube-api-access-nfvtz" (OuterVolumeSpecName: "kube-api-access-nfvtz") pod "3345756c-7e50-4d96-9fd0-da2efbdac54f" (UID: "3345756c-7e50-4d96-9fd0-da2efbdac54f"). InnerVolumeSpecName "kube-api-access-nfvtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.217247 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3345756c-7e50-4d96-9fd0-da2efbdac54f-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3345756c-7e50-4d96-9fd0-da2efbdac54f" (UID: "3345756c-7e50-4d96-9fd0-da2efbdac54f"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.310558 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfvtz\" (UniqueName: \"kubernetes.io/projected/3345756c-7e50-4d96-9fd0-da2efbdac54f-kube-api-access-nfvtz\") on node \"crc\" DevicePath \"\"" Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.310598 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3345756c-7e50-4d96-9fd0-da2efbdac54f-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.748331 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv" event={"ID":"3345756c-7e50-4d96-9fd0-da2efbdac54f","Type":"ContainerDied","Data":"67aadd516c0d6df3a2e3ec3f04384b990da959e3b0a6dfe99b19ce8a01a16337"} Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.748396 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67aadd516c0d6df3a2e3ec3f04384b990da959e3b0a6dfe99b19ce8a01a16337" Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.748401 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320290-9v9zv" Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.819996 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn"] Sep 30 07:30:04 crc kubenswrapper[4691]: I0930 07:30:04.830930 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320245-lbznn"] Sep 30 07:30:05 crc kubenswrapper[4691]: I0930 07:30:05.238881 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f23214a-9e66-4001-833c-671c08f7a95d" path="/var/lib/kubelet/pods/2f23214a-9e66-4001-833c-671c08f7a95d/volumes" Sep 30 07:30:27 crc kubenswrapper[4691]: I0930 07:30:27.022925 4691 scope.go:117] "RemoveContainer" containerID="4d7b49074587bba4e2b185e585678f74b2e7eb7fc35ba041b8f19e870bc21446" Sep 30 07:30:52 crc kubenswrapper[4691]: I0930 07:30:52.850540 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:30:52 crc kubenswrapper[4691]: I0930 07:30:52.851091 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:31:22 crc kubenswrapper[4691]: I0930 07:31:22.850449 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:31:22 crc kubenswrapper[4691]: I0930 07:31:22.851019 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:31:52 crc kubenswrapper[4691]: I0930 07:31:52.849874 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:31:52 crc kubenswrapper[4691]: I0930 07:31:52.850531 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:31:52 crc kubenswrapper[4691]: I0930 07:31:52.850582 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 07:31:52 crc kubenswrapper[4691]: I0930 07:31:52.851460 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"3c8dbde14b0f238a12d2a17d184b4856046cef36a2f8cbe725380984abbf070e"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:31:52 crc kubenswrapper[4691]: I0930 07:31:52.851522 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://3c8dbde14b0f238a12d2a17d184b4856046cef36a2f8cbe725380984abbf070e" gracePeriod=600 Sep 30 07:31:53 crc kubenswrapper[4691]: I0930 07:31:53.963312 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="3c8dbde14b0f238a12d2a17d184b4856046cef36a2f8cbe725380984abbf070e" exitCode=0 Sep 30 07:31:53 crc kubenswrapper[4691]: I0930 07:31:53.963380 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"3c8dbde14b0f238a12d2a17d184b4856046cef36a2f8cbe725380984abbf070e"} Sep 30 07:31:53 crc kubenswrapper[4691]: I0930 07:31:53.963662 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"} Sep 30 07:31:53 crc kubenswrapper[4691]: I0930 07:31:53.963682 4691 scope.go:117] "RemoveContainer" containerID="232d9e6ff7d50e3133db1d864568369b00d728911ca2e9a51d2661303a77e720" Sep 30 07:32:37 crc kubenswrapper[4691]: I0930 07:32:37.800385 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="08782d24-2bd9-48d6-b9b2-12a2ad66e6d0" containerName="galera" probeResult="failure" output="command timed out" Sep 30 07:32:37 crc kubenswrapper[4691]: I0930 07:32:37.800540 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="08782d24-2bd9-48d6-b9b2-12a2ad66e6d0" containerName="galera" probeResult="failure" output="command timed out" Sep 30 07:34:22 crc kubenswrapper[4691]: I0930 07:34:22.850414 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:34:22 crc kubenswrapper[4691]: I0930 07:34:22.851010 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:34:52 crc kubenswrapper[4691]: I0930 07:34:52.849594 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:34:52 crc kubenswrapper[4691]: I0930 07:34:52.850401 4691 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:35:22 crc kubenswrapper[4691]: I0930 07:35:22.850554 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:35:22 crc kubenswrapper[4691]: I0930 07:35:22.851227 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:35:22 crc kubenswrapper[4691]: I0930 07:35:22.851307 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 07:35:22 crc kubenswrapper[4691]: I0930 07:35:22.852569 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:35:22 crc kubenswrapper[4691]: I0930 07:35:22.852830 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" gracePeriod=600 Sep 30 07:35:23 crc kubenswrapper[4691]: I0930 07:35:23.212177 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" exitCode=0 Sep 30 07:35:23 crc kubenswrapper[4691]: I0930 07:35:23.212253 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"} Sep 30 07:35:23 crc kubenswrapper[4691]: I0930 07:35:23.212434 4691 scope.go:117] "RemoveContainer" containerID="3c8dbde14b0f238a12d2a17d184b4856046cef36a2f8cbe725380984abbf070e" Sep 30 07:35:23 crc kubenswrapper[4691]: E0930 07:35:23.708341 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:35:24 crc kubenswrapper[4691]: I0930 07:35:24.224167 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:35:24 crc kubenswrapper[4691]: E0930 
07:35:24.224762 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:35:36 crc kubenswrapper[4691]: I0930 07:35:36.225167 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:35:36 crc kubenswrapper[4691]: E0930 07:35:36.226145 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:35:51 crc kubenswrapper[4691]: I0930 07:35:51.224924 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:35:51 crc kubenswrapper[4691]: E0930 07:35:51.225536 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.438682 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7g8ws"] Sep 30 07:36:05 crc kubenswrapper[4691]: E0930 07:36:05.440088 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3345756c-7e50-4d96-9fd0-da2efbdac54f" containerName="collect-profiles" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.440107 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="3345756c-7e50-4d96-9fd0-da2efbdac54f" containerName="collect-profiles" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.440389 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="3345756c-7e50-4d96-9fd0-da2efbdac54f" containerName="collect-profiles" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.442165 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.463458 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7g8ws"] Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.498836 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-catalog-content\") pod \"redhat-operators-7g8ws\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.498934 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-utilities\") pod \"redhat-operators-7g8ws\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.499050 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwpzb\" (UniqueName: \"kubernetes.io/projected/b54588b9-b74c-44c0-aa51-07ad90269398-kube-api-access-qwpzb\") pod \"redhat-operators-7g8ws\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.600427 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwpzb\" (UniqueName: \"kubernetes.io/projected/b54588b9-b74c-44c0-aa51-07ad90269398-kube-api-access-qwpzb\") pod \"redhat-operators-7g8ws\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.600518 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-catalog-content\") pod \"redhat-operators-7g8ws\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.600564 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-utilities\") pod \"redhat-operators-7g8ws\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.601186 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-catalog-content\") pod \"redhat-operators-7g8ws\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.601211 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-utilities\") pod \"redhat-operators-7g8ws\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.625942 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qwpzb\" (UniqueName: \"kubernetes.io/projected/b54588b9-b74c-44c0-aa51-07ad90269398-kube-api-access-qwpzb\") pod \"redhat-operators-7g8ws\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.631679 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-658m9"] Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.637637 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.666316 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-658m9"] Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.702631 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-utilities\") pod \"community-operators-658m9\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.702775 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qgtq\" (UniqueName: \"kubernetes.io/projected/7725e183-fb93-4384-9abf-cbb9b539d4b4-kube-api-access-4qgtq\") pod \"community-operators-658m9\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.702831 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-catalog-content\") pod \"community-operators-658m9\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.770204 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.806261 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-utilities\") pod \"community-operators-658m9\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.806383 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qgtq\" (UniqueName: \"kubernetes.io/projected/7725e183-fb93-4384-9abf-cbb9b539d4b4-kube-api-access-4qgtq\") pod \"community-operators-658m9\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.806420 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-catalog-content\") pod \"community-operators-658m9\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.806789 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-utilities\") pod \"community-operators-658m9\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.806816 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-catalog-content\") pod \"community-operators-658m9\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:05 crc kubenswrapper[4691]: I0930 07:36:05.831755 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qgtq\" (UniqueName: \"kubernetes.io/projected/7725e183-fb93-4384-9abf-cbb9b539d4b4-kube-api-access-4qgtq\") pod \"community-operators-658m9\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:06 crc kubenswrapper[4691]: I0930 07:36:06.008677 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:06 crc kubenswrapper[4691]: I0930 07:36:06.228009 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:36:06 crc kubenswrapper[4691]: E0930 07:36:06.228436 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:36:06 crc kubenswrapper[4691]: I0930 07:36:06.388345 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7g8ws"] Sep 30 07:36:06 crc kubenswrapper[4691]: W0930 07:36:06.391636 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb54588b9_b74c_44c0_aa51_07ad90269398.slice/crio-63935dd77e6bd95a47d6bec47160b52ed4e2f2248a7685ee1c913b12fe978451 WatchSource:0}: Error finding container 63935dd77e6bd95a47d6bec47160b52ed4e2f2248a7685ee1c913b12fe978451: Status 404 returned error can't find the container with id 63935dd77e6bd95a47d6bec47160b52ed4e2f2248a7685ee1c913b12fe978451 Sep 30 07:36:06 crc kubenswrapper[4691]: I0930 07:36:06.675848 4691 generic.go:334] "Generic (PLEG): container finished" podID="b54588b9-b74c-44c0-aa51-07ad90269398" containerID="fb4d9179a0c3ad1e19515e850699ea3164d39f4198a1048d604316474239ad83" exitCode=0 Sep 30 07:36:06 crc kubenswrapper[4691]: I0930 07:36:06.675925 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g8ws" event={"ID":"b54588b9-b74c-44c0-aa51-07ad90269398","Type":"ContainerDied","Data":"fb4d9179a0c3ad1e19515e850699ea3164d39f4198a1048d604316474239ad83"} Sep 30 07:36:06 crc kubenswrapper[4691]: I0930 07:36:06.675970 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g8ws" event={"ID":"b54588b9-b74c-44c0-aa51-07ad90269398","Type":"ContainerStarted","Data":"63935dd77e6bd95a47d6bec47160b52ed4e2f2248a7685ee1c913b12fe978451"} Sep 30 07:36:06 crc kubenswrapper[4691]: I0930 07:36:06.678657 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:36:06 crc kubenswrapper[4691]: I0930 07:36:06.690869 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-658m9"] Sep 30 07:36:07 crc kubenswrapper[4691]: I0930 07:36:07.686669 4691 generic.go:334] "Generic (PLEG): container finished" podID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerID="1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e" exitCode=0 Sep 30 07:36:07 crc kubenswrapper[4691]: I0930 07:36:07.686756 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-658m9" event={"ID":"7725e183-fb93-4384-9abf-cbb9b539d4b4","Type":"ContainerDied","Data":"1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e"} Sep 30 07:36:07 crc kubenswrapper[4691]: I0930 07:36:07.687067 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-658m9" 
event={"ID":"7725e183-fb93-4384-9abf-cbb9b539d4b4","Type":"ContainerStarted","Data":"899dc25d0bb54666b106bee2f35b6ba1d124a200ced4e1689027eccc2fa47d21"} Sep 30 07:36:08 crc kubenswrapper[4691]: I0930 07:36:08.698337 4691 generic.go:334] "Generic (PLEG): container finished" podID="b54588b9-b74c-44c0-aa51-07ad90269398" containerID="aa9165ad525d598b80a9959f372a81be0d15fb810b608e8fa56694a989b4be42" exitCode=0 Sep 30 07:36:08 crc kubenswrapper[4691]: I0930 07:36:08.698443 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g8ws" event={"ID":"b54588b9-b74c-44c0-aa51-07ad90269398","Type":"ContainerDied","Data":"aa9165ad525d598b80a9959f372a81be0d15fb810b608e8fa56694a989b4be42"} Sep 30 07:36:08 crc kubenswrapper[4691]: I0930 07:36:08.701107 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-658m9" event={"ID":"7725e183-fb93-4384-9abf-cbb9b539d4b4","Type":"ContainerStarted","Data":"a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1"} Sep 30 07:36:09 crc kubenswrapper[4691]: I0930 07:36:09.711656 4691 generic.go:334] "Generic (PLEG): container finished" podID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerID="a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1" exitCode=0 Sep 30 07:36:09 crc kubenswrapper[4691]: I0930 07:36:09.712035 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-658m9" event={"ID":"7725e183-fb93-4384-9abf-cbb9b539d4b4","Type":"ContainerDied","Data":"a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1"} Sep 30 07:36:09 crc kubenswrapper[4691]: I0930 07:36:09.714790 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g8ws" event={"ID":"b54588b9-b74c-44c0-aa51-07ad90269398","Type":"ContainerStarted","Data":"ae63805830ef4d896106aa2cca97b9063dba7125ac210537c9de1249628f5e26"} Sep 30 07:36:09 crc kubenswrapper[4691]: I0930 07:36:09.779701 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7g8ws" podStartSLOduration=2.318080963 podStartE2EDuration="4.779682214s" podCreationTimestamp="2025-09-30 07:36:05 +0000 UTC" firstStartedPulling="2025-09-30 07:36:06.678353294 +0000 UTC m=+4610.153374344" lastFinishedPulling="2025-09-30 07:36:09.139954555 +0000 UTC m=+4612.614975595" observedRunningTime="2025-09-30 07:36:09.776301005 +0000 UTC m=+4613.251322045" watchObservedRunningTime="2025-09-30 07:36:09.779682214 +0000 UTC m=+4613.254703254" Sep 30 07:36:11 crc kubenswrapper[4691]: I0930 07:36:11.736062 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-658m9" event={"ID":"7725e183-fb93-4384-9abf-cbb9b539d4b4","Type":"ContainerStarted","Data":"737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae"} Sep 30 07:36:11 crc kubenswrapper[4691]: I0930 07:36:11.759776 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-658m9" podStartSLOduration=3.718503737 podStartE2EDuration="6.759758386s" podCreationTimestamp="2025-09-30 07:36:05 +0000 UTC" firstStartedPulling="2025-09-30 07:36:07.690113939 +0000 UTC m=+4611.165134989" lastFinishedPulling="2025-09-30 07:36:10.731368598 +0000 UTC m=+4614.206389638" observedRunningTime="2025-09-30 07:36:11.758297359 +0000 UTC m=+4615.233318419" watchObservedRunningTime="2025-09-30 07:36:11.759758386 +0000 UTC m=+4615.234779426" Sep 30 07:36:15 
crc kubenswrapper[4691]: I0930 07:36:15.771338 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:15 crc kubenswrapper[4691]: I0930 07:36:15.772862 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:15 crc kubenswrapper[4691]: I0930 07:36:15.850416 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:16 crc kubenswrapper[4691]: I0930 07:36:16.010245 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:16 crc kubenswrapper[4691]: I0930 07:36:16.010334 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:16 crc kubenswrapper[4691]: I0930 07:36:16.075532 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:16 crc kubenswrapper[4691]: I0930 07:36:16.865258 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:16 crc kubenswrapper[4691]: I0930 07:36:16.872932 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:17 crc kubenswrapper[4691]: I0930 07:36:17.424284 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-658m9"] Sep 30 07:36:18 crc kubenswrapper[4691]: I0930 07:36:18.829220 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-658m9" podUID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerName="registry-server" containerID="cri-o://737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae" gracePeriod=2 Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.249799 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7g8ws"] Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.251052 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7g8ws" podUID="b54588b9-b74c-44c0-aa51-07ad90269398" containerName="registry-server" containerID="cri-o://ae63805830ef4d896106aa2cca97b9063dba7125ac210537c9de1249628f5e26" gracePeriod=2 Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.534263 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.615074 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-utilities\") pod \"7725e183-fb93-4384-9abf-cbb9b539d4b4\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.615161 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qgtq\" (UniqueName: \"kubernetes.io/projected/7725e183-fb93-4384-9abf-cbb9b539d4b4-kube-api-access-4qgtq\") pod \"7725e183-fb93-4384-9abf-cbb9b539d4b4\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.615202 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-catalog-content\") pod \"7725e183-fb93-4384-9abf-cbb9b539d4b4\" (UID: \"7725e183-fb93-4384-9abf-cbb9b539d4b4\") " Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.615807 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-utilities" (OuterVolumeSpecName: "utilities") pod "7725e183-fb93-4384-9abf-cbb9b539d4b4" (UID: "7725e183-fb93-4384-9abf-cbb9b539d4b4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.638289 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7725e183-fb93-4384-9abf-cbb9b539d4b4-kube-api-access-4qgtq" (OuterVolumeSpecName: "kube-api-access-4qgtq") pod "7725e183-fb93-4384-9abf-cbb9b539d4b4" (UID: "7725e183-fb93-4384-9abf-cbb9b539d4b4"). InnerVolumeSpecName "kube-api-access-4qgtq". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.669781 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7725e183-fb93-4384-9abf-cbb9b539d4b4" (UID: "7725e183-fb93-4384-9abf-cbb9b539d4b4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.717667 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.717717 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qgtq\" (UniqueName: \"kubernetes.io/projected/7725e183-fb93-4384-9abf-cbb9b539d4b4-kube-api-access-4qgtq\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.717735 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7725e183-fb93-4384-9abf-cbb9b539d4b4-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.843945 4691 generic.go:334] "Generic (PLEG): container finished" podID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerID="737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae" exitCode=0 Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.844041 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-658m9" event={"ID":"7725e183-fb93-4384-9abf-cbb9b539d4b4","Type":"ContainerDied","Data":"737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae"} Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.844079 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-658m9" event={"ID":"7725e183-fb93-4384-9abf-cbb9b539d4b4","Type":"ContainerDied","Data":"899dc25d0bb54666b106bee2f35b6ba1d124a200ced4e1689027eccc2fa47d21"} Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.844105 4691 scope.go:117] "RemoveContainer" containerID="737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.844301 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-658m9" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.851378 4691 generic.go:334] "Generic (PLEG): container finished" podID="b54588b9-b74c-44c0-aa51-07ad90269398" containerID="ae63805830ef4d896106aa2cca97b9063dba7125ac210537c9de1249628f5e26" exitCode=0 Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.851452 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g8ws" event={"ID":"b54588b9-b74c-44c0-aa51-07ad90269398","Type":"ContainerDied","Data":"ae63805830ef4d896106aa2cca97b9063dba7125ac210537c9de1249628f5e26"} Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.882276 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-658m9"] Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.882683 4691 scope.go:117] "RemoveContainer" containerID="a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.893307 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-658m9"] Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.914450 4691 scope.go:117] "RemoveContainer" containerID="1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.974994 4691 scope.go:117] "RemoveContainer" containerID="737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae" Sep 30 07:36:19 crc kubenswrapper[4691]: E0930 07:36:19.977705 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae\": container with ID starting with 737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae not found: ID does not exist" containerID="737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.977764 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae"} err="failed to get container status \"737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae\": rpc error: code = NotFound desc = could not find container \"737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae\": container with ID starting with 737d41fea792a8043b8db72ebc31649055374da0d0cddb8a753ef176a82e56ae not found: ID does not exist" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.977803 4691 scope.go:117] "RemoveContainer" containerID="a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1" Sep 30 07:36:19 crc kubenswrapper[4691]: E0930 07:36:19.981222 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1\": container with ID starting with a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1 not found: ID does not exist" containerID="a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.981274 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1"} err="failed to get container status 
\"a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1\": rpc error: code = NotFound desc = could not find container \"a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1\": container with ID starting with a19d9d2509084f43db26f8cf9a2e07f6027ca158c77f494171ba24da6f987cc1 not found: ID does not exist" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.981311 4691 scope.go:117] "RemoveContainer" containerID="1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e" Sep 30 07:36:19 crc kubenswrapper[4691]: E0930 07:36:19.981655 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e\": container with ID starting with 1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e not found: ID does not exist" containerID="1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e" Sep 30 07:36:19 crc kubenswrapper[4691]: I0930 07:36:19.981701 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e"} err="failed to get container status \"1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e\": rpc error: code = NotFound desc = could not find container \"1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e\": container with ID starting with 1ada86fe6dbca6203dd5095fbe2f8563bca8570004fcd78ee8db032ca64efe6e not found: ID does not exist" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.264469 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.331860 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-catalog-content\") pod \"b54588b9-b74c-44c0-aa51-07ad90269398\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.332030 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-utilities\") pod \"b54588b9-b74c-44c0-aa51-07ad90269398\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.332151 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwpzb\" (UniqueName: \"kubernetes.io/projected/b54588b9-b74c-44c0-aa51-07ad90269398-kube-api-access-qwpzb\") pod \"b54588b9-b74c-44c0-aa51-07ad90269398\" (UID: \"b54588b9-b74c-44c0-aa51-07ad90269398\") " Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.333355 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-utilities" (OuterVolumeSpecName: "utilities") pod "b54588b9-b74c-44c0-aa51-07ad90269398" (UID: "b54588b9-b74c-44c0-aa51-07ad90269398"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.340402 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b54588b9-b74c-44c0-aa51-07ad90269398-kube-api-access-qwpzb" (OuterVolumeSpecName: "kube-api-access-qwpzb") pod "b54588b9-b74c-44c0-aa51-07ad90269398" (UID: "b54588b9-b74c-44c0-aa51-07ad90269398"). InnerVolumeSpecName "kube-api-access-qwpzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.411068 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b54588b9-b74c-44c0-aa51-07ad90269398" (UID: "b54588b9-b74c-44c0-aa51-07ad90269398"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.436184 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.436221 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b54588b9-b74c-44c0-aa51-07ad90269398-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.436233 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwpzb\" (UniqueName: \"kubernetes.io/projected/b54588b9-b74c-44c0-aa51-07ad90269398-kube-api-access-qwpzb\") on node \"crc\" DevicePath \"\"" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.869604 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7g8ws" event={"ID":"b54588b9-b74c-44c0-aa51-07ad90269398","Type":"ContainerDied","Data":"63935dd77e6bd95a47d6bec47160b52ed4e2f2248a7685ee1c913b12fe978451"} Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.869682 4691 scope.go:117] "RemoveContainer" containerID="ae63805830ef4d896106aa2cca97b9063dba7125ac210537c9de1249628f5e26" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.869707 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7g8ws" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.915711 4691 scope.go:117] "RemoveContainer" containerID="aa9165ad525d598b80a9959f372a81be0d15fb810b608e8fa56694a989b4be42" Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.924796 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7g8ws"] Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.934078 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7g8ws"] Sep 30 07:36:20 crc kubenswrapper[4691]: I0930 07:36:20.958698 4691 scope.go:117] "RemoveContainer" containerID="fb4d9179a0c3ad1e19515e850699ea3164d39f4198a1048d604316474239ad83" Sep 30 07:36:21 crc kubenswrapper[4691]: I0930 07:36:21.226880 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:36:21 crc kubenswrapper[4691]: E0930 07:36:21.227446 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:36:21 crc kubenswrapper[4691]: I0930 07:36:21.238443 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7725e183-fb93-4384-9abf-cbb9b539d4b4" path="/var/lib/kubelet/pods/7725e183-fb93-4384-9abf-cbb9b539d4b4/volumes" Sep 30 07:36:21 crc kubenswrapper[4691]: I0930 07:36:21.239141 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b54588b9-b74c-44c0-aa51-07ad90269398" path="/var/lib/kubelet/pods/b54588b9-b74c-44c0-aa51-07ad90269398/volumes" Sep 30 07:36:36 crc kubenswrapper[4691]: I0930 07:36:36.225080 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:36:36 crc kubenswrapper[4691]: E0930 07:36:36.226045 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:36:50 crc kubenswrapper[4691]: I0930 07:36:50.226056 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:36:50 crc kubenswrapper[4691]: E0930 07:36:50.226869 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:37:02 crc kubenswrapper[4691]: I0930 07:37:02.224830 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:37:02 crc kubenswrapper[4691]: E0930 07:37:02.225856 4691 
Sep 30 07:37:15 crc kubenswrapper[4691]: I0930 07:37:15.225766 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:37:15 crc kubenswrapper[4691]: E0930 07:37:15.226945 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:37:26 crc kubenswrapper[4691]: I0930 07:37:26.225777 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:37:26 crc kubenswrapper[4691]: E0930 07:37:26.227137 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:37:38 crc kubenswrapper[4691]: I0930 07:37:38.226292 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:37:38 crc kubenswrapper[4691]: E0930 07:37:38.227120 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.359979 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-j87v4"]
Sep 30 07:37:40 crc kubenswrapper[4691]: E0930 07:37:40.361182 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54588b9-b74c-44c0-aa51-07ad90269398" containerName="extract-content"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.361219 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54588b9-b74c-44c0-aa51-07ad90269398" containerName="extract-content"
Sep 30 07:37:40 crc kubenswrapper[4691]: E0930 07:37:40.361263 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerName="extract-utilities"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.361278 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerName="extract-utilities"
Sep 30 07:37:40 crc kubenswrapper[4691]: E0930 07:37:40.361314 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54588b9-b74c-44c0-aa51-07ad90269398" containerName="extract-utilities"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.361344 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54588b9-b74c-44c0-aa51-07ad90269398" containerName="extract-utilities"
Sep 30 07:37:40 crc kubenswrapper[4691]: E0930 07:37:40.361393 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b54588b9-b74c-44c0-aa51-07ad90269398" containerName="registry-server"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.361412 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="b54588b9-b74c-44c0-aa51-07ad90269398" containerName="registry-server"
Sep 30 07:37:40 crc kubenswrapper[4691]: E0930 07:37:40.361460 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerName="extract-content"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.361477 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerName="extract-content"
Sep 30 07:37:40 crc kubenswrapper[4691]: E0930 07:37:40.361509 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerName="registry-server"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.361526 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerName="registry-server"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.362001 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="7725e183-fb93-4384-9abf-cbb9b539d4b4" containerName="registry-server"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.362059 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="b54588b9-b74c-44c0-aa51-07ad90269398" containerName="registry-server"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.365453 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.375361 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j87v4"]
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.446092 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-catalog-content\") pod \"certified-operators-j87v4\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.446403 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-utilities\") pod \"certified-operators-j87v4\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.446534 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4l5q\" (UniqueName: \"kubernetes.io/projected/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-kube-api-access-p4l5q\") pod \"certified-operators-j87v4\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.548651 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-catalog-content\") pod \"certified-operators-j87v4\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.548758 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-utilities\") pod \"certified-operators-j87v4\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.548797 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4l5q\" (UniqueName: \"kubernetes.io/projected/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-kube-api-access-p4l5q\") pod \"certified-operators-j87v4\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.549202 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-catalog-content\") pod \"certified-operators-j87v4\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.549362 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-utilities\") pod \"certified-operators-j87v4\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.602032 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4l5q\" (UniqueName: \"kubernetes.io/projected/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-kube-api-access-p4l5q\") pod \"certified-operators-j87v4\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:40 crc kubenswrapper[4691]: I0930 07:37:40.702535 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j87v4"
Sep 30 07:37:41 crc kubenswrapper[4691]: I0930 07:37:41.284685 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-j87v4"]
Sep 30 07:37:41 crc kubenswrapper[4691]: I0930 07:37:41.821082 4691 generic.go:334] "Generic (PLEG): container finished" podID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerID="05b40333e229a1aff6d05080d760a7b4050035cc12d8750e5690675422a9ce84" exitCode=0
Sep 30 07:37:41 crc kubenswrapper[4691]: I0930 07:37:41.821270 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j87v4" event={"ID":"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f","Type":"ContainerDied","Data":"05b40333e229a1aff6d05080d760a7b4050035cc12d8750e5690675422a9ce84"}
Sep 30 07:37:41 crc kubenswrapper[4691]: I0930 07:37:41.821505 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j87v4" event={"ID":"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f","Type":"ContainerStarted","Data":"0b84c72201a2b106ecd17ce8cf3f1826deaf3ec07db5acc7d336f516539da930"}
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.352295 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lspbc"]
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.356055 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.380311 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lspbc"]
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.450322 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-catalog-content\") pod \"redhat-marketplace-lspbc\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.450511 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-utilities\") pod \"redhat-marketplace-lspbc\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.450660 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g9q2\" (UniqueName: \"kubernetes.io/projected/6cd22d18-ee1a-4b48-af58-82ce2873c42c-kube-api-access-2g9q2\") pod \"redhat-marketplace-lspbc\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.552916 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g9q2\" (UniqueName: \"kubernetes.io/projected/6cd22d18-ee1a-4b48-af58-82ce2873c42c-kube-api-access-2g9q2\") pod \"redhat-marketplace-lspbc\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.553347 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-catalog-content\") pod \"redhat-marketplace-lspbc\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.553387 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-utilities\") pod \"redhat-marketplace-lspbc\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.554117 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-utilities\") pod \"redhat-marketplace-lspbc\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.554228 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-catalog-content\") pod \"redhat-marketplace-lspbc\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.578756 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g9q2\" (UniqueName: \"kubernetes.io/projected/6cd22d18-ee1a-4b48-af58-82ce2873c42c-kube-api-access-2g9q2\") pod \"redhat-marketplace-lspbc\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.698263 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lspbc"
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.881134 4691 generic.go:334] "Generic (PLEG): container finished" podID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerID="324a91dd4194a351dd110561695c946503bf8a3ef90240fed9ca81077fe0af17" exitCode=0
Sep 30 07:37:44 crc kubenswrapper[4691]: I0930 07:37:44.881307 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j87v4" event={"ID":"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f","Type":"ContainerDied","Data":"324a91dd4194a351dd110561695c946503bf8a3ef90240fed9ca81077fe0af17"}
Sep 30 07:37:45 crc kubenswrapper[4691]: I0930 07:37:45.183262 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lspbc"]
Sep 30 07:37:45 crc kubenswrapper[4691]: I0930 07:37:45.894842 4691 generic.go:334] "Generic (PLEG): container finished" podID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" containerID="a4bb1731a49887346c9dbfd7ccfd980dae500bf8aed177a71d5c0640dd830100" exitCode=0
Sep 30 07:37:45 crc kubenswrapper[4691]: I0930 07:37:45.895112 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lspbc" event={"ID":"6cd22d18-ee1a-4b48-af58-82ce2873c42c","Type":"ContainerDied","Data":"a4bb1731a49887346c9dbfd7ccfd980dae500bf8aed177a71d5c0640dd830100"}
Sep 30 07:37:45 crc kubenswrapper[4691]: I0930 07:37:45.895170 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lspbc" event={"ID":"6cd22d18-ee1a-4b48-af58-82ce2873c42c","Type":"ContainerStarted","Data":"5c564d69cfaace1726f3c27876a89a0738d1ce6fea2af54b6291766600795b2b"}
Sep 30 07:37:46 crc kubenswrapper[4691]: I0930 07:37:46.906509 4691 generic.go:334] "Generic (PLEG): container finished" podID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" containerID="5ef58bcd73abf2d8d643b4be241c3d3aa8f186ad77f6f319694107f9d321b8a7" exitCode=0
Sep 30 07:37:46 crc kubenswrapper[4691]: I0930 07:37:46.906562 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lspbc" event={"ID":"6cd22d18-ee1a-4b48-af58-82ce2873c42c","Type":"ContainerDied","Data":"5ef58bcd73abf2d8d643b4be241c3d3aa8f186ad77f6f319694107f9d321b8a7"}
Sep 30 07:37:46 crc kubenswrapper[4691]: I0930 07:37:46.910004 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j87v4" event={"ID":"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f","Type":"ContainerStarted","Data":"c09712b500ebb7cd4868925d242a6eb29bb5e1a6a1fb965483bc23c6058221be"}
Sep 30 07:37:46 crc kubenswrapper[4691]: I0930 07:37:46.952495 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-j87v4" podStartSLOduration=3.115841328 podStartE2EDuration="6.952478585s" podCreationTimestamp="2025-09-30 07:37:40 +0000 UTC" firstStartedPulling="2025-09-30 07:37:41.823935953 +0000 UTC m=+4705.298956993" lastFinishedPulling="2025-09-30 07:37:45.66057321 +0000 UTC m=+4709.135594250" observedRunningTime="2025-09-30 07:37:46.951589866 +0000 UTC m=+4710.426610926" watchObservedRunningTime="2025-09-30 07:37:46.952478585 +0000 UTC m=+4710.427499615"
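The reconciler_common.go / operation_generator.go lines above always run in the same order per volume: VerifyControllerAttachedVolume, then MountVolume started, then MountVolume.SetUp succeeded; teardown later mirrors it in reverse. The shape of that desired-state/actual-state loop is sketched below with hypothetical types, not the real volumemanager API.

    // Rough shape of the volume reconciler: mount whatever is desired but
    // not yet mounted, unmount whatever is mounted but no longer desired.
    package main

    import "fmt"

    func reconcile(desired, mounted map[string]bool) {
    	for vol := range desired {
    		if !mounted[vol] {
    			fmt.Printf("MountVolume started for volume %q\n", vol)
    			mounted[vol] = true // "MountVolume.SetUp succeeded"
    		}
    	}
    	for vol := range mounted {
    		if !desired[vol] {
    			fmt.Printf("UnmountVolume started for volume %q\n", vol)
    			delete(mounted, vol) // "Volume detached"
    		}
    	}
    }

    func main() {
    	desired := map[string]bool{"catalog-content": true, "utilities": true, "kube-api-access-2g9q2": true}
    	mounted := map[string]bool{}
    	reconcile(desired, mounted)
    	fmt.Println("mounted volumes:", len(mounted))
    }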
m=+4710.426610926" watchObservedRunningTime="2025-09-30 07:37:46.952478585 +0000 UTC m=+4710.427499615" Sep 30 07:37:47 crc kubenswrapper[4691]: I0930 07:37:47.924309 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lspbc" event={"ID":"6cd22d18-ee1a-4b48-af58-82ce2873c42c","Type":"ContainerStarted","Data":"e85873fb564531fb30f97357112809c8c70431498230521cbf242b50060eaf0e"} Sep 30 07:37:47 crc kubenswrapper[4691]: I0930 07:37:47.944498 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lspbc" podStartSLOduration=2.5144739879999998 podStartE2EDuration="3.94447943s" podCreationTimestamp="2025-09-30 07:37:44 +0000 UTC" firstStartedPulling="2025-09-30 07:37:45.896370921 +0000 UTC m=+4709.371391961" lastFinishedPulling="2025-09-30 07:37:47.326376373 +0000 UTC m=+4710.801397403" observedRunningTime="2025-09-30 07:37:47.941635319 +0000 UTC m=+4711.416656399" watchObservedRunningTime="2025-09-30 07:37:47.94447943 +0000 UTC m=+4711.419500470" Sep 30 07:37:49 crc kubenswrapper[4691]: I0930 07:37:49.225241 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:37:49 crc kubenswrapper[4691]: E0930 07:37:49.225986 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:37:50 crc kubenswrapper[4691]: I0930 07:37:50.704314 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-j87v4" Sep 30 07:37:50 crc kubenswrapper[4691]: I0930 07:37:50.704803 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-j87v4" Sep 30 07:37:50 crc kubenswrapper[4691]: I0930 07:37:50.774187 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-j87v4" Sep 30 07:37:50 crc kubenswrapper[4691]: I0930 07:37:50.998454 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-j87v4" Sep 30 07:37:53 crc kubenswrapper[4691]: I0930 07:37:53.534036 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j87v4"] Sep 30 07:37:53 crc kubenswrapper[4691]: I0930 07:37:53.534785 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-j87v4" podUID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerName="registry-server" containerID="cri-o://c09712b500ebb7cd4868925d242a6eb29bb5e1a6a1fb965483bc23c6058221be" gracePeriod=2 Sep 30 07:37:53 crc kubenswrapper[4691]: I0930 07:37:53.993101 4691 generic.go:334] "Generic (PLEG): container finished" podID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerID="c09712b500ebb7cd4868925d242a6eb29bb5e1a6a1fb965483bc23c6058221be" exitCode=0 Sep 30 07:37:53 crc kubenswrapper[4691]: I0930 07:37:53.993174 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j87v4" 
event={"ID":"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f","Type":"ContainerDied","Data":"c09712b500ebb7cd4868925d242a6eb29bb5e1a6a1fb965483bc23c6058221be"} Sep 30 07:37:54 crc kubenswrapper[4691]: I0930 07:37:54.112875 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j87v4" Sep 30 07:37:54 crc kubenswrapper[4691]: I0930 07:37:54.146807 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-utilities\") pod \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " Sep 30 07:37:54 crc kubenswrapper[4691]: I0930 07:37:54.147071 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-catalog-content\") pod \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " Sep 30 07:37:54 crc kubenswrapper[4691]: I0930 07:37:54.147183 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4l5q\" (UniqueName: \"kubernetes.io/projected/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-kube-api-access-p4l5q\") pod \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\" (UID: \"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f\") " Sep 30 07:37:54 crc kubenswrapper[4691]: I0930 07:37:54.148013 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-utilities" (OuterVolumeSpecName: "utilities") pod "8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" (UID: "8b5e2b99-1ae0-4f26-9bb4-67c831e0109f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4691]: I0930 07:37:54.215170 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" (UID: "8b5e2b99-1ae0-4f26-9bb4-67c831e0109f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:54 crc kubenswrapper[4691]: I0930 07:37:54.250363 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4691]: I0930 07:37:54.250410 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:54 crc kubenswrapper[4691]: I0930 07:37:54.699079 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lspbc" Sep 30 07:37:54 crc kubenswrapper[4691]: I0930 07:37:54.699158 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lspbc" Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.013216 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-j87v4" event={"ID":"8b5e2b99-1ae0-4f26-9bb4-67c831e0109f","Type":"ContainerDied","Data":"0b84c72201a2b106ecd17ce8cf3f1826deaf3ec07db5acc7d336f516539da930"} Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.013273 4691 scope.go:117] "RemoveContainer" containerID="c09712b500ebb7cd4868925d242a6eb29bb5e1a6a1fb965483bc23c6058221be" Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.013301 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-j87v4" Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.039133 4691 scope.go:117] "RemoveContainer" containerID="324a91dd4194a351dd110561695c946503bf8a3ef90240fed9ca81077fe0af17" Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.102334 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-kube-api-access-p4l5q" (OuterVolumeSpecName: "kube-api-access-p4l5q") pod "8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" (UID: "8b5e2b99-1ae0-4f26-9bb4-67c831e0109f"). InnerVolumeSpecName "kube-api-access-p4l5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.116620 4691 scope.go:117] "RemoveContainer" containerID="05b40333e229a1aff6d05080d760a7b4050035cc12d8750e5690675422a9ce84" Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.172706 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4l5q\" (UniqueName: \"kubernetes.io/projected/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f-kube-api-access-p4l5q\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.352352 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lspbc" Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.398153 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-j87v4"] Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.406002 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-j87v4"] Sep 30 07:37:55 crc kubenswrapper[4691]: I0930 07:37:55.425516 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lspbc" Sep 30 07:37:57 crc kubenswrapper[4691]: I0930 07:37:57.242100 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" path="/var/lib/kubelet/pods/8b5e2b99-1ae0-4f26-9bb4-67c831e0109f/volumes" Sep 30 07:37:58 crc kubenswrapper[4691]: I0930 07:37:58.541965 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lspbc"] Sep 30 07:37:58 crc kubenswrapper[4691]: I0930 07:37:58.542749 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lspbc" podUID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" containerName="registry-server" containerID="cri-o://e85873fb564531fb30f97357112809c8c70431498230521cbf242b50060eaf0e" gracePeriod=2 Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.055723 4691 generic.go:334] "Generic (PLEG): container finished" podID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" containerID="e85873fb564531fb30f97357112809c8c70431498230521cbf242b50060eaf0e" exitCode=0 Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.055805 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lspbc" event={"ID":"6cd22d18-ee1a-4b48-af58-82ce2873c42c","Type":"ContainerDied","Data":"e85873fb564531fb30f97357112809c8c70431498230521cbf242b50060eaf0e"} Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.056293 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lspbc" event={"ID":"6cd22d18-ee1a-4b48-af58-82ce2873c42c","Type":"ContainerDied","Data":"5c564d69cfaace1726f3c27876a89a0738d1ce6fea2af54b6291766600795b2b"} Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.056315 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c564d69cfaace1726f3c27876a89a0738d1ce6fea2af54b6291766600795b2b" Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.105555 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lspbc" Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.171611 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-catalog-content\") pod \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.171846 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-utilities\") pod \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.171972 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g9q2\" (UniqueName: \"kubernetes.io/projected/6cd22d18-ee1a-4b48-af58-82ce2873c42c-kube-api-access-2g9q2\") pod \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\" (UID: \"6cd22d18-ee1a-4b48-af58-82ce2873c42c\") " Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.172957 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-utilities" (OuterVolumeSpecName: "utilities") pod "6cd22d18-ee1a-4b48-af58-82ce2873c42c" (UID: "6cd22d18-ee1a-4b48-af58-82ce2873c42c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.180979 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cd22d18-ee1a-4b48-af58-82ce2873c42c-kube-api-access-2g9q2" (OuterVolumeSpecName: "kube-api-access-2g9q2") pod "6cd22d18-ee1a-4b48-af58-82ce2873c42c" (UID: "6cd22d18-ee1a-4b48-af58-82ce2873c42c"). InnerVolumeSpecName "kube-api-access-2g9q2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.186050 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6cd22d18-ee1a-4b48-af58-82ce2873c42c" (UID: "6cd22d18-ee1a-4b48-af58-82ce2873c42c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.274804 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.274836 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6cd22d18-ee1a-4b48-af58-82ce2873c42c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:37:59 crc kubenswrapper[4691]: I0930 07:37:59.274845 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g9q2\" (UniqueName: \"kubernetes.io/projected/6cd22d18-ee1a-4b48-af58-82ce2873c42c-kube-api-access-2g9q2\") on node \"crc\" DevicePath \"\"" Sep 30 07:38:00 crc kubenswrapper[4691]: I0930 07:38:00.068531 4691 util.go:48] "No ready sandbox for pod can be found. 
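The "SyncLoop (probe)" lines in this window show the gating order between probe types: while the startup probe is failing, readiness is reported with status="" (not yet evaluated); once startup flips to "started", readiness probing begins and eventually reports "ready". A toy state machine illustrating that ordering follows; the types are hypothetical, not kubelet's prober internals.

    // Toy startup/readiness gating: readiness is only evaluated after the
    // startup probe has succeeded once.
    package main

    import "fmt"

    type podProbes struct{ startupDone bool }

    func (p *podProbes) observe(startupOK, readinessOK bool) (startup, readiness string) {
    	if !p.startupDone {
    		if !startupOK {
    			return "unhealthy", "" // readiness not evaluated yet
    		}
    		p.startupDone = true
    		return "started", ""
    	}
    	if readinessOK {
    		return "started", "ready"
    	}
    	return "started", "not ready"
    }

    func main() {
    	var p podProbes
    	for _, tick := range []struct{ s, r bool }{{false, false}, {true, false}, {true, true}} {
    		s, r := p.observe(tick.s, tick.r)
    		fmt.Printf("startup=%q readiness=%q\n", s, r)
    	}
    }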
Sep 30 07:38:00 crc kubenswrapper[4691]: I0930 07:38:00.102844 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lspbc"]
Sep 30 07:38:00 crc kubenswrapper[4691]: I0930 07:38:00.116527 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lspbc"]
Sep 30 07:38:01 crc kubenswrapper[4691]: I0930 07:38:01.244001 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" path="/var/lib/kubelet/pods/6cd22d18-ee1a-4b48-af58-82ce2873c42c/volumes"
Sep 30 07:38:04 crc kubenswrapper[4691]: I0930 07:38:04.224507 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:38:04 crc kubenswrapper[4691]: E0930 07:38:04.225874 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:38:16 crc kubenswrapper[4691]: I0930 07:38:16.225608 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:38:16 crc kubenswrapper[4691]: E0930 07:38:16.226402 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:38:30 crc kubenswrapper[4691]: I0930 07:38:30.225472 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:38:30 crc kubenswrapper[4691]: E0930 07:38:30.226266 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:38:42 crc kubenswrapper[4691]: I0930 07:38:42.225547 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:38:42 crc kubenswrapper[4691]: E0930 07:38:42.226550 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:38:57 crc kubenswrapper[4691]: I0930 07:38:57.232466 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:38:57 crc kubenswrapper[4691]: E0930 07:38:57.233142 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:39:11 crc kubenswrapper[4691]: I0930 07:39:11.225605 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:39:11 crc kubenswrapper[4691]: E0930 07:39:11.226761 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:39:25 crc kubenswrapper[4691]: I0930 07:39:25.225668 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:39:25 crc kubenswrapper[4691]: E0930 07:39:25.226479 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:39:39 crc kubenswrapper[4691]: I0930 07:39:39.224611 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:39:39 crc kubenswrapper[4691]: E0930 07:39:39.225357 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:39:51 crc kubenswrapper[4691]: I0930 07:39:51.224870 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:39:51 crc kubenswrapper[4691]: E0930 07:39:51.225743 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:40:03 crc kubenswrapper[4691]: I0930 07:40:03.228525 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9"
Sep 30 07:40:03 crc kubenswrapper[4691]: E0930 07:40:03.229295 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:40:17 crc kubenswrapper[4691]: I0930 07:40:17.232411 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:40:17 crc kubenswrapper[4691]: E0930 07:40:17.233293 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:40:31 crc kubenswrapper[4691]: I0930 07:40:31.225530 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:40:31 crc kubenswrapper[4691]: I0930 07:40:31.782343 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"46e59622e78b8b72120d17869ac466e93f8c7d3d5e94b7b3aebd944a72fbfe16"} Sep 30 07:42:52 crc kubenswrapper[4691]: I0930 07:42:52.850548 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:42:52 crc kubenswrapper[4691]: I0930 07:42:52.851183 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:43:22 crc kubenswrapper[4691]: I0930 07:43:22.850426 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:43:22 crc kubenswrapper[4691]: I0930 07:43:22.850985 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:43:52 crc kubenswrapper[4691]: I0930 07:43:52.861040 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:43:52 crc kubenswrapper[4691]: I0930 07:43:52.862001 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" 
podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:43:52 crc kubenswrapper[4691]: I0930 07:43:52.862337 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 07:43:52 crc kubenswrapper[4691]: I0930 07:43:52.864910 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"46e59622e78b8b72120d17869ac466e93f8c7d3d5e94b7b3aebd944a72fbfe16"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:43:52 crc kubenswrapper[4691]: I0930 07:43:52.865029 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://46e59622e78b8b72120d17869ac466e93f8c7d3d5e94b7b3aebd944a72fbfe16" gracePeriod=600 Sep 30 07:43:53 crc kubenswrapper[4691]: I0930 07:43:53.076324 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="46e59622e78b8b72120d17869ac466e93f8c7d3d5e94b7b3aebd944a72fbfe16" exitCode=0 Sep 30 07:43:53 crc kubenswrapper[4691]: I0930 07:43:53.076664 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"46e59622e78b8b72120d17869ac466e93f8c7d3d5e94b7b3aebd944a72fbfe16"} Sep 30 07:43:53 crc kubenswrapper[4691]: I0930 07:43:53.076705 4691 scope.go:117] "RemoveContainer" containerID="f188c8214aa7e772a4491a6d7c828ca911338d6668e5ea8374385e72206b46f9" Sep 30 07:43:54 crc kubenswrapper[4691]: I0930 07:43:54.087397 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"} Sep 30 07:44:27 crc kubenswrapper[4691]: I0930 07:44:27.403749 4691 scope.go:117] "RemoveContainer" containerID="5ef58bcd73abf2d8d643b4be241c3d3aa8f186ad77f6f319694107f9d321b8a7" Sep 30 07:44:27 crc kubenswrapper[4691]: I0930 07:44:27.432655 4691 scope.go:117] "RemoveContainer" containerID="e85873fb564531fb30f97357112809c8c70431498230521cbf242b50060eaf0e" Sep 30 07:44:27 crc kubenswrapper[4691]: I0930 07:44:27.492721 4691 scope.go:117] "RemoveContainer" containerID="a4bb1731a49887346c9dbfd7ccfd980dae500bf8aed177a71d5c0640dd830100" Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.179258 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"] Sep 30 07:45:00 crc kubenswrapper[4691]: E0930 07:45:00.180998 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" containerName="registry-server" Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.181032 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" containerName="registry-server" Sep 30 07:45:00 crc kubenswrapper[4691]: E0930 07:45:00.181062 4691 
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.181080 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" containerName="extract-utilities"
Sep 30 07:45:00 crc kubenswrapper[4691]: E0930 07:45:00.181121 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" containerName="extract-content"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.181140 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" containerName="extract-content"
Sep 30 07:45:00 crc kubenswrapper[4691]: E0930 07:45:00.181192 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerName="extract-utilities"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.181209 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerName="extract-utilities"
Sep 30 07:45:00 crc kubenswrapper[4691]: E0930 07:45:00.181244 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerName="extract-content"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.181261 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerName="extract-content"
Sep 30 07:45:00 crc kubenswrapper[4691]: E0930 07:45:00.181289 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerName="registry-server"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.181305 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerName="registry-server"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.181852 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cd22d18-ee1a-4b48-af58-82ce2873c42c" containerName="registry-server"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.181956 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5e2b99-1ae0-4f26-9bb4-67c831e0109f" containerName="registry-server"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.183714 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.186342 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.187851 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.197988 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"]
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.268410 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c784119a-5e3f-4069-ab5e-36b6d8381f6b-secret-volume\") pod \"collect-profiles-29320305-nmfjh\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.268907 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rktxr\" (UniqueName: \"kubernetes.io/projected/c784119a-5e3f-4069-ab5e-36b6d8381f6b-kube-api-access-rktxr\") pod \"collect-profiles-29320305-nmfjh\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.269022 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c784119a-5e3f-4069-ab5e-36b6d8381f6b-config-volume\") pod \"collect-profiles-29320305-nmfjh\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.371262 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c784119a-5e3f-4069-ab5e-36b6d8381f6b-secret-volume\") pod \"collect-profiles-29320305-nmfjh\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.371418 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rktxr\" (UniqueName: \"kubernetes.io/projected/c784119a-5e3f-4069-ab5e-36b6d8381f6b-kube-api-access-rktxr\") pod \"collect-profiles-29320305-nmfjh\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.371455 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c784119a-5e3f-4069-ab5e-36b6d8381f6b-config-volume\") pod \"collect-profiles-29320305-nmfjh\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"
Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.372779 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c784119a-5e3f-4069-ab5e-36b6d8381f6b-config-volume\") pod \"collect-profiles-29320305-nmfjh\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"
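The job name collect-profiles-29320305-nmfjh encodes its own schedule: the CronJob controller suffixes child Jobs with the scheduled time expressed in minutes since the Unix epoch (the trailing -nmfjh is the pod's random suffix). Decoding 29320305 lands exactly on this log's timestamps:

    // Decode a CronJob child-job suffix (minutes since the Unix epoch).
    package main

    import (
    	"fmt"
    	"time"
    )

    func main() {
    	const scheduledMinutes = 29320305
    	t := time.Unix(scheduledMinutes*60, 0).UTC()
    	fmt.Println(t) // 2025-09-30 07:45:00 +0000 UTC
    }

That matches the 07:45:00 SyncLoop ADD above, and the earlier collect-profiles-29320260-8mrl4 deleted below is the run from 45 minutes before (29320305 - 29320260 = 45).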
\"collect-profiles-29320305-nmfjh\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.378831 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c784119a-5e3f-4069-ab5e-36b6d8381f6b-secret-volume\") pod \"collect-profiles-29320305-nmfjh\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.404712 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rktxr\" (UniqueName: \"kubernetes.io/projected/c784119a-5e3f-4069-ab5e-36b6d8381f6b-kube-api-access-rktxr\") pod \"collect-profiles-29320305-nmfjh\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.508544 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" Sep 30 07:45:00 crc kubenswrapper[4691]: I0930 07:45:00.980193 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh"] Sep 30 07:45:00 crc kubenswrapper[4691]: W0930 07:45:00.987382 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc784119a_5e3f_4069_ab5e_36b6d8381f6b.slice/crio-40d0e2bf56c1b010f0f36f7813ffa3883eea7b3e3d9da73e9f7c609569248078 WatchSource:0}: Error finding container 40d0e2bf56c1b010f0f36f7813ffa3883eea7b3e3d9da73e9f7c609569248078: Status 404 returned error can't find the container with id 40d0e2bf56c1b010f0f36f7813ffa3883eea7b3e3d9da73e9f7c609569248078 Sep 30 07:45:01 crc kubenswrapper[4691]: I0930 07:45:01.864798 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" event={"ID":"c784119a-5e3f-4069-ab5e-36b6d8381f6b","Type":"ContainerStarted","Data":"e5d02d66c8a409199bb73d9d1c0c9659c457159e505c91e79d14eb2386121013"} Sep 30 07:45:01 crc kubenswrapper[4691]: I0930 07:45:01.865085 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" event={"ID":"c784119a-5e3f-4069-ab5e-36b6d8381f6b","Type":"ContainerStarted","Data":"40d0e2bf56c1b010f0f36f7813ffa3883eea7b3e3d9da73e9f7c609569248078"} Sep 30 07:45:01 crc kubenswrapper[4691]: I0930 07:45:01.884851 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" podStartSLOduration=1.884831731 podStartE2EDuration="1.884831731s" podCreationTimestamp="2025-09-30 07:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:45:01.879740378 +0000 UTC m=+5145.354761428" watchObservedRunningTime="2025-09-30 07:45:01.884831731 +0000 UTC m=+5145.359852771" Sep 30 07:45:02 crc kubenswrapper[4691]: I0930 07:45:02.879216 4691 generic.go:334] "Generic (PLEG): container finished" podID="c784119a-5e3f-4069-ab5e-36b6d8381f6b" containerID="e5d02d66c8a409199bb73d9d1c0c9659c457159e505c91e79d14eb2386121013" exitCode=0 Sep 30 07:45:02 crc kubenswrapper[4691]: I0930 07:45:02.879313 
4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" event={"ID":"c784119a-5e3f-4069-ab5e-36b6d8381f6b","Type":"ContainerDied","Data":"e5d02d66c8a409199bb73d9d1c0c9659c457159e505c91e79d14eb2386121013"} Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.290073 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.453230 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rktxr\" (UniqueName: \"kubernetes.io/projected/c784119a-5e3f-4069-ab5e-36b6d8381f6b-kube-api-access-rktxr\") pod \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.453450 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c784119a-5e3f-4069-ab5e-36b6d8381f6b-config-volume\") pod \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.453489 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c784119a-5e3f-4069-ab5e-36b6d8381f6b-secret-volume\") pod \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\" (UID: \"c784119a-5e3f-4069-ab5e-36b6d8381f6b\") " Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.454442 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c784119a-5e3f-4069-ab5e-36b6d8381f6b-config-volume" (OuterVolumeSpecName: "config-volume") pod "c784119a-5e3f-4069-ab5e-36b6d8381f6b" (UID: "c784119a-5e3f-4069-ab5e-36b6d8381f6b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.465621 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c784119a-5e3f-4069-ab5e-36b6d8381f6b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c784119a-5e3f-4069-ab5e-36b6d8381f6b" (UID: "c784119a-5e3f-4069-ab5e-36b6d8381f6b"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.469336 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c784119a-5e3f-4069-ab5e-36b6d8381f6b-kube-api-access-rktxr" (OuterVolumeSpecName: "kube-api-access-rktxr") pod "c784119a-5e3f-4069-ab5e-36b6d8381f6b" (UID: "c784119a-5e3f-4069-ab5e-36b6d8381f6b"). InnerVolumeSpecName "kube-api-access-rktxr". 
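The generic.go:334 / kubelet.go:2453 pairs throughout this log come from PLEG (the pod lifecycle event generator): a periodic relist fetches container states from the runtime, diffs them against the previous snapshot, and feeds ContainerStarted/ContainerDied events into the sync loop. A schematic diff with hypothetical types:

    // Schematic PLEG relist: emit an event for every state transition.
    package main

    import "fmt"

    func plegDiff(prev, curr map[string]string) {
    	for id, state := range curr {
    		if prev[id] != state {
    			switch state {
    			case "running":
    				fmt.Printf("event ContainerStarted %s\n", id)
    			case "exited":
    				fmt.Printf("event ContainerDied %s\n", id)
    			}
    		}
    	}
    }

    func main() {
    	prev := map[string]string{"e5d02d66": "running"}
    	curr := map[string]string{"e5d02d66": "exited"}
    	plegDiff(prev, curr) // prints: event ContainerDied e5d02d66
    }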
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.556459 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c784119a-5e3f-4069-ab5e-36b6d8381f6b-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.556721 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c784119a-5e3f-4069-ab5e-36b6d8381f6b-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.556803 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rktxr\" (UniqueName: \"kubernetes.io/projected/c784119a-5e3f-4069-ab5e-36b6d8381f6b-kube-api-access-rktxr\") on node \"crc\" DevicePath \"\"" Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.905226 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" event={"ID":"c784119a-5e3f-4069-ab5e-36b6d8381f6b","Type":"ContainerDied","Data":"40d0e2bf56c1b010f0f36f7813ffa3883eea7b3e3d9da73e9f7c609569248078"} Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.905282 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320305-nmfjh" Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.905285 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40d0e2bf56c1b010f0f36f7813ffa3883eea7b3e3d9da73e9f7c609569248078" Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.971255 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4"] Sep 30 07:45:04 crc kubenswrapper[4691]: I0930 07:45:04.983097 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320260-8mrl4"] Sep 30 07:45:05 crc kubenswrapper[4691]: I0930 07:45:05.240790 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b641bf2-9852-4550-bbb0-9add7388a1f6" path="/var/lib/kubelet/pods/9b641bf2-9852-4550-bbb0-9add7388a1f6/volumes" Sep 30 07:45:27 crc kubenswrapper[4691]: I0930 07:45:27.547296 4691 scope.go:117] "RemoveContainer" containerID="c25f8b2d85989887fdce89b2427ec3a0e8dac608cf51ab17b1ea51a207231a25" Sep 30 07:46:22 crc kubenswrapper[4691]: I0930 07:46:22.850043 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:46:22 crc kubenswrapper[4691]: I0930 07:46:22.850691 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.618861 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hs5kk"] Sep 30 07:46:43 crc kubenswrapper[4691]: E0930 07:46:43.621349 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c784119a-5e3f-4069-ab5e-36b6d8381f6b" 
containerName="collect-profiles" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.622215 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c784119a-5e3f-4069-ab5e-36b6d8381f6b" containerName="collect-profiles" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.622535 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c784119a-5e3f-4069-ab5e-36b6d8381f6b" containerName="collect-profiles" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.624484 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.655616 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hs5kk"] Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.755929 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-catalog-content\") pod \"community-operators-hs5kk\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.756013 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw87l\" (UniqueName: \"kubernetes.io/projected/e3e138e1-10ac-4f13-aaab-33299a8a590a-kube-api-access-zw87l\") pod \"community-operators-hs5kk\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.756162 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-utilities\") pod \"community-operators-hs5kk\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.858597 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-catalog-content\") pod \"community-operators-hs5kk\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.858688 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw87l\" (UniqueName: \"kubernetes.io/projected/e3e138e1-10ac-4f13-aaab-33299a8a590a-kube-api-access-zw87l\") pod \"community-operators-hs5kk\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.858729 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-utilities\") pod \"community-operators-hs5kk\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.859347 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-utilities\") pod \"community-operators-hs5kk\" (UID: 
\"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.859608 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-catalog-content\") pod \"community-operators-hs5kk\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.881942 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw87l\" (UniqueName: \"kubernetes.io/projected/e3e138e1-10ac-4f13-aaab-33299a8a590a-kube-api-access-zw87l\") pod \"community-operators-hs5kk\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:43 crc kubenswrapper[4691]: I0930 07:46:43.957051 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:44 crc kubenswrapper[4691]: I0930 07:46:44.544940 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hs5kk"] Sep 30 07:46:45 crc kubenswrapper[4691]: I0930 07:46:45.028824 4691 generic.go:334] "Generic (PLEG): container finished" podID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerID="e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273" exitCode=0 Sep 30 07:46:45 crc kubenswrapper[4691]: I0930 07:46:45.029043 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs5kk" event={"ID":"e3e138e1-10ac-4f13-aaab-33299a8a590a","Type":"ContainerDied","Data":"e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273"} Sep 30 07:46:45 crc kubenswrapper[4691]: I0930 07:46:45.029099 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs5kk" event={"ID":"e3e138e1-10ac-4f13-aaab-33299a8a590a","Type":"ContainerStarted","Data":"76767add588e52f506652363152ce9235b48583f238c7650efc5003ca05d3a61"} Sep 30 07:46:45 crc kubenswrapper[4691]: I0930 07:46:45.031043 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:46:46 crc kubenswrapper[4691]: I0930 07:46:46.041778 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs5kk" event={"ID":"e3e138e1-10ac-4f13-aaab-33299a8a590a","Type":"ContainerStarted","Data":"38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365"} Sep 30 07:46:47 crc kubenswrapper[4691]: I0930 07:46:47.059656 4691 generic.go:334] "Generic (PLEG): container finished" podID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerID="38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365" exitCode=0 Sep 30 07:46:47 crc kubenswrapper[4691]: I0930 07:46:47.059695 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs5kk" event={"ID":"e3e138e1-10ac-4f13-aaab-33299a8a590a","Type":"ContainerDied","Data":"38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365"} Sep 30 07:46:49 crc kubenswrapper[4691]: I0930 07:46:49.099460 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs5kk" event={"ID":"e3e138e1-10ac-4f13-aaab-33299a8a590a","Type":"ContainerStarted","Data":"99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37"} Sep 30 
07:46:49 crc kubenswrapper[4691]: I0930 07:46:49.128130 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hs5kk" podStartSLOduration=3.659145715 podStartE2EDuration="6.128078761s" podCreationTimestamp="2025-09-30 07:46:43 +0000 UTC" firstStartedPulling="2025-09-30 07:46:45.030805044 +0000 UTC m=+5248.505826084" lastFinishedPulling="2025-09-30 07:46:47.49973805 +0000 UTC m=+5250.974759130" observedRunningTime="2025-09-30 07:46:49.121183251 +0000 UTC m=+5252.596204331" watchObservedRunningTime="2025-09-30 07:46:49.128078761 +0000 UTC m=+5252.603099821" Sep 30 07:46:52 crc kubenswrapper[4691]: I0930 07:46:52.849722 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:46:52 crc kubenswrapper[4691]: I0930 07:46:52.850058 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:46:53 crc kubenswrapper[4691]: I0930 07:46:53.957323 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:53 crc kubenswrapper[4691]: I0930 07:46:53.957767 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:54 crc kubenswrapper[4691]: I0930 07:46:54.029254 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:54 crc kubenswrapper[4691]: I0930 07:46:54.224145 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:54 crc kubenswrapper[4691]: I0930 07:46:54.274394 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hs5kk"] Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.181501 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hs5kk" podUID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerName="registry-server" containerID="cri-o://99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37" gracePeriod=2 Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.650154 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.748378 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-utilities\") pod \"e3e138e1-10ac-4f13-aaab-33299a8a590a\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.748714 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zw87l\" (UniqueName: \"kubernetes.io/projected/e3e138e1-10ac-4f13-aaab-33299a8a590a-kube-api-access-zw87l\") pod \"e3e138e1-10ac-4f13-aaab-33299a8a590a\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.748875 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-catalog-content\") pod \"e3e138e1-10ac-4f13-aaab-33299a8a590a\" (UID: \"e3e138e1-10ac-4f13-aaab-33299a8a590a\") " Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.749165 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-utilities" (OuterVolumeSpecName: "utilities") pod "e3e138e1-10ac-4f13-aaab-33299a8a590a" (UID: "e3e138e1-10ac-4f13-aaab-33299a8a590a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.749623 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.755927 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3e138e1-10ac-4f13-aaab-33299a8a590a-kube-api-access-zw87l" (OuterVolumeSpecName: "kube-api-access-zw87l") pod "e3e138e1-10ac-4f13-aaab-33299a8a590a" (UID: "e3e138e1-10ac-4f13-aaab-33299a8a590a"). InnerVolumeSpecName "kube-api-access-zw87l". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.803173 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e3e138e1-10ac-4f13-aaab-33299a8a590a" (UID: "e3e138e1-10ac-4f13-aaab-33299a8a590a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.852249 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zw87l\" (UniqueName: \"kubernetes.io/projected/e3e138e1-10ac-4f13-aaab-33299a8a590a-kube-api-access-zw87l\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:56 crc kubenswrapper[4691]: I0930 07:46:56.852302 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e3e138e1-10ac-4f13-aaab-33299a8a590a-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:46:57 crc kubenswrapper[4691]: I0930 07:46:57.195867 4691 generic.go:334] "Generic (PLEG): container finished" podID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerID="99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37" exitCode=0 Sep 30 07:46:57 crc kubenswrapper[4691]: I0930 07:46:57.195991 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hs5kk" Sep 30 07:46:57 crc kubenswrapper[4691]: I0930 07:46:57.196735 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs5kk" event={"ID":"e3e138e1-10ac-4f13-aaab-33299a8a590a","Type":"ContainerDied","Data":"99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37"} Sep 30 07:46:57 crc kubenswrapper[4691]: I0930 07:46:57.196851 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hs5kk" event={"ID":"e3e138e1-10ac-4f13-aaab-33299a8a590a","Type":"ContainerDied","Data":"76767add588e52f506652363152ce9235b48583f238c7650efc5003ca05d3a61"} Sep 30 07:46:57 crc kubenswrapper[4691]: I0930 07:46:57.196966 4691 scope.go:117] "RemoveContainer" containerID="99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37" Sep 30 07:46:57 crc kubenswrapper[4691]: I0930 07:46:57.233112 4691 scope.go:117] "RemoveContainer" containerID="38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365" Sep 30 07:46:57 crc kubenswrapper[4691]: I0930 07:46:57.263857 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hs5kk"] Sep 30 07:46:57 crc kubenswrapper[4691]: I0930 07:46:57.273714 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hs5kk"] Sep 30 07:46:57 crc kubenswrapper[4691]: I0930 07:46:57.927242 4691 scope.go:117] "RemoveContainer" containerID="e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273" Sep 30 07:46:58 crc kubenswrapper[4691]: I0930 07:46:58.004782 4691 scope.go:117] "RemoveContainer" containerID="99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37" Sep 30 07:46:58 crc kubenswrapper[4691]: E0930 07:46:58.005299 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37\": container with ID starting with 99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37 not found: ID does not exist" containerID="99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37" Sep 30 07:46:58 crc kubenswrapper[4691]: I0930 07:46:58.005333 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37"} err="failed to get container status 
\"99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37\": rpc error: code = NotFound desc = could not find container \"99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37\": container with ID starting with 99e9efbfec64d02721127021e81b5de0408efebbd0c8f3b70f12c2fc6f315d37 not found: ID does not exist" Sep 30 07:46:58 crc kubenswrapper[4691]: I0930 07:46:58.005357 4691 scope.go:117] "RemoveContainer" containerID="38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365" Sep 30 07:46:58 crc kubenswrapper[4691]: E0930 07:46:58.005766 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365\": container with ID starting with 38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365 not found: ID does not exist" containerID="38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365" Sep 30 07:46:58 crc kubenswrapper[4691]: I0930 07:46:58.005903 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365"} err="failed to get container status \"38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365\": rpc error: code = NotFound desc = could not find container \"38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365\": container with ID starting with 38b3fd3b78d687aa43b6e33301be19f55f3476605a8ab194279ce85a504da365 not found: ID does not exist" Sep 30 07:46:58 crc kubenswrapper[4691]: I0930 07:46:58.005958 4691 scope.go:117] "RemoveContainer" containerID="e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273" Sep 30 07:46:58 crc kubenswrapper[4691]: E0930 07:46:58.006329 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273\": container with ID starting with e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273 not found: ID does not exist" containerID="e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273" Sep 30 07:46:58 crc kubenswrapper[4691]: I0930 07:46:58.006359 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273"} err="failed to get container status \"e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273\": rpc error: code = NotFound desc = could not find container \"e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273\": container with ID starting with e00ea800358c1726e6386e20379456af54c59a38eb447d3891570c26ca5c4273 not found: ID does not exist" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.243178 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3e138e1-10ac-4f13-aaab-33299a8a590a" path="/var/lib/kubelet/pods/e3e138e1-10ac-4f13-aaab-33299a8a590a/volumes" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.693766 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-f6694"] Sep 30 07:46:59 crc kubenswrapper[4691]: E0930 07:46:59.694451 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerName="extract-utilities" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.694485 4691 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerName="extract-utilities" Sep 30 07:46:59 crc kubenswrapper[4691]: E0930 07:46:59.694508 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerName="extract-content" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.694520 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerName="extract-content" Sep 30 07:46:59 crc kubenswrapper[4691]: E0930 07:46:59.694543 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerName="registry-server" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.694557 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerName="registry-server" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.695426 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3e138e1-10ac-4f13-aaab-33299a8a590a" containerName="registry-server" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.697790 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.716428 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f6694"] Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.827746 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2hl2\" (UniqueName: \"kubernetes.io/projected/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-kube-api-access-l2hl2\") pod \"redhat-operators-f6694\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.829749 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-catalog-content\") pod \"redhat-operators-f6694\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.829873 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-utilities\") pod \"redhat-operators-f6694\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.931246 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-catalog-content\") pod \"redhat-operators-f6694\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.931329 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-utilities\") pod \"redhat-operators-f6694\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.931390 4691 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-l2hl2\" (UniqueName: \"kubernetes.io/projected/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-kube-api-access-l2hl2\") pod \"redhat-operators-f6694\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.931827 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-catalog-content\") pod \"redhat-operators-f6694\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:46:59 crc kubenswrapper[4691]: I0930 07:46:59.931848 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-utilities\") pod \"redhat-operators-f6694\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:47:00 crc kubenswrapper[4691]: I0930 07:47:00.204392 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2hl2\" (UniqueName: \"kubernetes.io/projected/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-kube-api-access-l2hl2\") pod \"redhat-operators-f6694\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:47:00 crc kubenswrapper[4691]: I0930 07:47:00.341412 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:47:00 crc kubenswrapper[4691]: I0930 07:47:00.841535 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-f6694"] Sep 30 07:47:00 crc kubenswrapper[4691]: W0930 07:47:00.849926 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc6c63a6_344d_4be3_a502_0c3c1d5a8d0c.slice/crio-c652d5208761477e57e30c4a316fe6b89adfbf0473595275a34ecd8aa4ee26df WatchSource:0}: Error finding container c652d5208761477e57e30c4a316fe6b89adfbf0473595275a34ecd8aa4ee26df: Status 404 returned error can't find the container with id c652d5208761477e57e30c4a316fe6b89adfbf0473595275a34ecd8aa4ee26df Sep 30 07:47:01 crc kubenswrapper[4691]: I0930 07:47:01.241507 4691 generic.go:334] "Generic (PLEG): container finished" podID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerID="e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417" exitCode=0 Sep 30 07:47:01 crc kubenswrapper[4691]: I0930 07:47:01.241551 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6694" event={"ID":"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c","Type":"ContainerDied","Data":"e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417"} Sep 30 07:47:01 crc kubenswrapper[4691]: I0930 07:47:01.241577 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6694" event={"ID":"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c","Type":"ContainerStarted","Data":"c652d5208761477e57e30c4a316fe6b89adfbf0473595275a34ecd8aa4ee26df"} Sep 30 07:47:02 crc kubenswrapper[4691]: I0930 07:47:02.257213 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6694" 
event={"ID":"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c","Type":"ContainerStarted","Data":"88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f"} Sep 30 07:47:05 crc kubenswrapper[4691]: I0930 07:47:05.289610 4691 generic.go:334] "Generic (PLEG): container finished" podID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerID="88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f" exitCode=0 Sep 30 07:47:05 crc kubenswrapper[4691]: I0930 07:47:05.289681 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6694" event={"ID":"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c","Type":"ContainerDied","Data":"88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f"} Sep 30 07:47:06 crc kubenswrapper[4691]: I0930 07:47:06.302645 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6694" event={"ID":"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c","Type":"ContainerStarted","Data":"39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389"} Sep 30 07:47:06 crc kubenswrapper[4691]: I0930 07:47:06.323971 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-f6694" podStartSLOduration=2.8197367719999997 podStartE2EDuration="7.323944475s" podCreationTimestamp="2025-09-30 07:46:59 +0000 UTC" firstStartedPulling="2025-09-30 07:47:01.243657707 +0000 UTC m=+5264.718678747" lastFinishedPulling="2025-09-30 07:47:05.74786537 +0000 UTC m=+5269.222886450" observedRunningTime="2025-09-30 07:47:06.321040712 +0000 UTC m=+5269.796061772" watchObservedRunningTime="2025-09-30 07:47:06.323944475 +0000 UTC m=+5269.798965575" Sep 30 07:47:10 crc kubenswrapper[4691]: I0930 07:47:10.341646 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:47:10 crc kubenswrapper[4691]: I0930 07:47:10.342363 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:47:11 crc kubenswrapper[4691]: I0930 07:47:11.398335 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-f6694" podUID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerName="registry-server" probeResult="failure" output=< Sep 30 07:47:11 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s Sep 30 07:47:11 crc kubenswrapper[4691]: > Sep 30 07:47:20 crc kubenswrapper[4691]: I0930 07:47:20.421019 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:47:20 crc kubenswrapper[4691]: I0930 07:47:20.501990 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:47:20 crc kubenswrapper[4691]: I0930 07:47:20.676943 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f6694"] Sep 30 07:47:21 crc kubenswrapper[4691]: I0930 07:47:21.490529 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-f6694" podUID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerName="registry-server" containerID="cri-o://39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389" gracePeriod=2 Sep 30 07:47:21 crc kubenswrapper[4691]: I0930 07:47:21.974384 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.034744 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-utilities\") pod \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.035082 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-catalog-content\") pod \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.035144 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2hl2\" (UniqueName: \"kubernetes.io/projected/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-kube-api-access-l2hl2\") pod \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\" (UID: \"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c\") " Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.035737 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-utilities" (OuterVolumeSpecName: "utilities") pod "bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" (UID: "bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.041361 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-kube-api-access-l2hl2" (OuterVolumeSpecName: "kube-api-access-l2hl2") pod "bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" (UID: "bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c"). InnerVolumeSpecName "kube-api-access-l2hl2". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.132092 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" (UID: "bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.137567 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-utilities\") on node \"crc\" DevicePath \"\"" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.137602 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-catalog-content\") on node \"crc\" DevicePath \"\"" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.137624 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2hl2\" (UniqueName: \"kubernetes.io/projected/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c-kube-api-access-l2hl2\") on node \"crc\" DevicePath \"\"" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.501640 4691 generic.go:334] "Generic (PLEG): container finished" podID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerID="39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389" exitCode=0 Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.501920 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6694" event={"ID":"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c","Type":"ContainerDied","Data":"39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389"} Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.501952 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-f6694" event={"ID":"bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c","Type":"ContainerDied","Data":"c652d5208761477e57e30c4a316fe6b89adfbf0473595275a34ecd8aa4ee26df"} Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.501972 4691 scope.go:117] "RemoveContainer" containerID="39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.502161 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-f6694" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.549096 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-f6694"] Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.550909 4691 scope.go:117] "RemoveContainer" containerID="88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.562053 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-f6694"] Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.850304 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.850376 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.850440 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.851201 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:47:22 crc kubenswrapper[4691]: I0930 07:47:22.851298 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da" gracePeriod=600 Sep 30 07:47:23 crc kubenswrapper[4691]: E0930 07:47:23.129874 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.135781 4691 scope.go:117] "RemoveContainer" containerID="e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417" Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.239655 4691 scope.go:117] "RemoveContainer" containerID="39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389" Sep 30 07:47:23 crc kubenswrapper[4691]: E0930 07:47:23.240535 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389\": container with ID starting with 
39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389 not found: ID does not exist" containerID="39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389" Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.240585 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389"} err="failed to get container status \"39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389\": rpc error: code = NotFound desc = could not find container \"39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389\": container with ID starting with 39e9beb830231af96640c818b7730cbdee77d777e559cda3b75b32378abed389 not found: ID does not exist" Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.240620 4691 scope.go:117] "RemoveContainer" containerID="88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f" Sep 30 07:47:23 crc kubenswrapper[4691]: E0930 07:47:23.241369 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f\": container with ID starting with 88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f not found: ID does not exist" containerID="88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f" Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.241489 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f"} err="failed to get container status \"88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f\": rpc error: code = NotFound desc = could not find container \"88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f\": container with ID starting with 88826f1da5f792e7946d98e7d3455e90359c35f4d40f822832e575249da3371f not found: ID does not exist" Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.241568 4691 scope.go:117] "RemoveContainer" containerID="e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417" Sep 30 07:47:23 crc kubenswrapper[4691]: E0930 07:47:23.242094 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417\": container with ID starting with e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417 not found: ID does not exist" containerID="e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417" Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.242186 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417"} err="failed to get container status \"e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417\": rpc error: code = NotFound desc = could not find container \"e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417\": container with ID starting with e8bc2c18aa1239dead5188c500986c13d1384a4e2bdb3783b34d34528f599417 not found: ID does not exist" Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.250282 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" path="/var/lib/kubelet/pods/bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c/volumes" Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.512509 
4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da" exitCode=0 Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.512600 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"} Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.512639 4691 scope.go:117] "RemoveContainer" containerID="46e59622e78b8b72120d17869ac466e93f8c7d3d5e94b7b3aebd944a72fbfe16" Sep 30 07:47:23 crc kubenswrapper[4691]: I0930 07:47:23.513376 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da" Sep 30 07:47:23 crc kubenswrapper[4691]: E0930 07:47:23.513757 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:47:37 crc kubenswrapper[4691]: I0930 07:47:37.231768 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da" Sep 30 07:47:37 crc kubenswrapper[4691]: E0930 07:47:37.232465 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:47:48 crc kubenswrapper[4691]: I0930 07:47:48.225469 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da" Sep 30 07:47:48 crc kubenswrapper[4691]: E0930 07:47:48.226786 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.608353 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-62fxl"] Sep 30 07:47:54 crc kubenswrapper[4691]: E0930 07:47:54.609994 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerName="registry-server" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.610027 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerName="registry-server" Sep 30 07:47:54 crc kubenswrapper[4691]: E0930 07:47:54.610073 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerName="extract-content" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 
07:47:54.610091 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerName="extract-content" Sep 30 07:47:54 crc kubenswrapper[4691]: E0930 07:47:54.610180 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerName="extract-utilities" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.610199 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerName="extract-utilities" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.610680 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc6c63a6-344d-4be3-a502-0c3c1d5a8d0c" containerName="registry-server" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.617089 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.622653 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62fxl"] Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.763404 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xm75\" (UniqueName: \"kubernetes.io/projected/a9496dbb-2305-4955-8584-250778e74b43-kube-api-access-6xm75\") pod \"redhat-marketplace-62fxl\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") " pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.763616 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-utilities\") pod \"redhat-marketplace-62fxl\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") " pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.763757 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-catalog-content\") pod \"redhat-marketplace-62fxl\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") " pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.865383 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-utilities\") pod \"redhat-marketplace-62fxl\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") " pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.865520 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-catalog-content\") pod \"redhat-marketplace-62fxl\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") " pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.865549 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xm75\" (UniqueName: \"kubernetes.io/projected/a9496dbb-2305-4955-8584-250778e74b43-kube-api-access-6xm75\") pod \"redhat-marketplace-62fxl\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") " pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 
07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.865990 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-utilities\") pod \"redhat-marketplace-62fxl\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") " pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.866026 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-catalog-content\") pod \"redhat-marketplace-62fxl\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") " pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.892685 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xm75\" (UniqueName: \"kubernetes.io/projected/a9496dbb-2305-4955-8584-250778e74b43-kube-api-access-6xm75\") pod \"redhat-marketplace-62fxl\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") " pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 07:47:54 crc kubenswrapper[4691]: I0930 07:47:54.965187 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62fxl" Sep 30 07:47:55 crc kubenswrapper[4691]: I0930 07:47:55.444530 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-62fxl"] Sep 30 07:47:55 crc kubenswrapper[4691]: I0930 07:47:55.900460 4691 generic.go:334] "Generic (PLEG): container finished" podID="a9496dbb-2305-4955-8584-250778e74b43" containerID="e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1" exitCode=0 Sep 30 07:47:55 crc kubenswrapper[4691]: I0930 07:47:55.900716 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fxl" event={"ID":"a9496dbb-2305-4955-8584-250778e74b43","Type":"ContainerDied","Data":"e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1"} Sep 30 07:47:55 crc kubenswrapper[4691]: I0930 07:47:55.900747 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fxl" event={"ID":"a9496dbb-2305-4955-8584-250778e74b43","Type":"ContainerStarted","Data":"c44d9305d0a831e72d214f26aa4bd563b9940dd655628639ccb7bfe0ddf18725"} Sep 30 07:47:57 crc kubenswrapper[4691]: I0930 07:47:57.925977 4691 generic.go:334] "Generic (PLEG): container finished" podID="a9496dbb-2305-4955-8584-250778e74b43" containerID="89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95" exitCode=0 Sep 30 07:47:57 crc kubenswrapper[4691]: I0930 07:47:57.926093 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fxl" event={"ID":"a9496dbb-2305-4955-8584-250778e74b43","Type":"ContainerDied","Data":"89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95"} Sep 30 07:47:58 crc kubenswrapper[4691]: I0930 07:47:58.944863 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fxl" event={"ID":"a9496dbb-2305-4955-8584-250778e74b43","Type":"ContainerStarted","Data":"1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85"} Sep 30 07:47:58 crc kubenswrapper[4691]: I0930 07:47:58.996861 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-62fxl" podStartSLOduration=2.545404723 
podStartE2EDuration="4.996830889s" podCreationTimestamp="2025-09-30 07:47:54 +0000 UTC" firstStartedPulling="2025-09-30 07:47:55.90280881 +0000 UTC m=+5319.377829860" lastFinishedPulling="2025-09-30 07:47:58.354234986 +0000 UTC m=+5321.829256026" observedRunningTime="2025-09-30 07:47:58.978040728 +0000 UTC m=+5322.453061768" watchObservedRunningTime="2025-09-30 07:47:58.996830889 +0000 UTC m=+5322.471851969"
Sep 30 07:48:02 crc kubenswrapper[4691]: I0930 07:48:02.224827 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:48:02 crc kubenswrapper[4691]: E0930 07:48:02.225382 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:48:04 crc kubenswrapper[4691]: I0930 07:48:04.965415 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-62fxl"
Sep 30 07:48:04 crc kubenswrapper[4691]: I0930 07:48:04.966920 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-62fxl"
Sep 30 07:48:05 crc kubenswrapper[4691]: I0930 07:48:05.045368 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-62fxl"
Sep 30 07:48:06 crc kubenswrapper[4691]: I0930 07:48:06.124937 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-62fxl"
Sep 30 07:48:06 crc kubenswrapper[4691]: I0930 07:48:06.196519 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62fxl"]
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.062089 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-62fxl" podUID="a9496dbb-2305-4955-8584-250778e74b43" containerName="registry-server" containerID="cri-o://1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85" gracePeriod=2
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.538982 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62fxl"
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.668630 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-utilities\") pod \"a9496dbb-2305-4955-8584-250778e74b43\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") "
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.669067 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-catalog-content\") pod \"a9496dbb-2305-4955-8584-250778e74b43\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") "
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.669299 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xm75\" (UniqueName: \"kubernetes.io/projected/a9496dbb-2305-4955-8584-250778e74b43-kube-api-access-6xm75\") pod \"a9496dbb-2305-4955-8584-250778e74b43\" (UID: \"a9496dbb-2305-4955-8584-250778e74b43\") "
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.669660 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-utilities" (OuterVolumeSpecName: "utilities") pod "a9496dbb-2305-4955-8584-250778e74b43" (UID: "a9496dbb-2305-4955-8584-250778e74b43"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.670090 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.675653 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9496dbb-2305-4955-8584-250778e74b43-kube-api-access-6xm75" (OuterVolumeSpecName: "kube-api-access-6xm75") pod "a9496dbb-2305-4955-8584-250778e74b43" (UID: "a9496dbb-2305-4955-8584-250778e74b43"). InnerVolumeSpecName "kube-api-access-6xm75". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.682976 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9496dbb-2305-4955-8584-250778e74b43" (UID: "a9496dbb-2305-4955-8584-250778e74b43"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.771675 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9496dbb-2305-4955-8584-250778e74b43-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:48:08 crc kubenswrapper[4691]: I0930 07:48:08.771717 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xm75\" (UniqueName: \"kubernetes.io/projected/a9496dbb-2305-4955-8584-250778e74b43-kube-api-access-6xm75\") on node \"crc\" DevicePath \"\""
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.080315 4691 generic.go:334] "Generic (PLEG): container finished" podID="a9496dbb-2305-4955-8584-250778e74b43" containerID="1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85" exitCode=0
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.080374 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fxl" event={"ID":"a9496dbb-2305-4955-8584-250778e74b43","Type":"ContainerDied","Data":"1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85"}
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.080410 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-62fxl" event={"ID":"a9496dbb-2305-4955-8584-250778e74b43","Type":"ContainerDied","Data":"c44d9305d0a831e72d214f26aa4bd563b9940dd655628639ccb7bfe0ddf18725"}
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.080434 4691 scope.go:117] "RemoveContainer" containerID="1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85"
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.080493 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-62fxl"
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.116630 4691 scope.go:117] "RemoveContainer" containerID="89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95"
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.147907 4691 scope.go:117] "RemoveContainer" containerID="e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1"
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.148460 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-62fxl"]
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.163816 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-62fxl"]
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.212918 4691 scope.go:117] "RemoveContainer" containerID="1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85"
Sep 30 07:48:09 crc kubenswrapper[4691]: E0930 07:48:09.213472 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85\": container with ID starting with 1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85 not found: ID does not exist" containerID="1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85"
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.213520 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85"} err="failed to get container status \"1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85\": rpc error: code = NotFound desc = could not find container \"1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85\": container with ID starting with 1acdf4f71408883583defb597b0b61e854ad0139382d8d741dcc152107379f85 not found: ID does not exist"
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.213553 4691 scope.go:117] "RemoveContainer" containerID="89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95"
Sep 30 07:48:09 crc kubenswrapper[4691]: E0930 07:48:09.214014 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95\": container with ID starting with 89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95 not found: ID does not exist" containerID="89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95"
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.214071 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95"} err="failed to get container status \"89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95\": rpc error: code = NotFound desc = could not find container \"89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95\": container with ID starting with 89ed71c0175ac99e783adbbc5e0f3887f8d34792770ca5bbbb630b6d3407ad95 not found: ID does not exist"
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.214093 4691 scope.go:117] "RemoveContainer" containerID="e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1"
Sep 30 07:48:09 crc kubenswrapper[4691]: E0930 07:48:09.214360 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1\": container with ID starting with e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1 not found: ID does not exist" containerID="e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1"
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.214415 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1"} err="failed to get container status \"e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1\": rpc error: code = NotFound desc = could not find container \"e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1\": container with ID starting with e125f443513f5d939de5dfc0d7f43e5378508b6ab6c30d26c97cbe864e07bef1 not found: ID does not exist"
Sep 30 07:48:09 crc kubenswrapper[4691]: I0930 07:48:09.236459 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9496dbb-2305-4955-8584-250778e74b43" path="/var/lib/kubelet/pods/a9496dbb-2305-4955-8584-250778e74b43/volumes"
Sep 30 07:48:14 crc kubenswrapper[4691]: I0930 07:48:14.224932 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:48:14 crc kubenswrapper[4691]: E0930 07:48:14.226025 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:48:16 crc kubenswrapper[4691]: I0930 07:48:16.995992 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ds2r8"]
Sep 30 07:48:16 crc kubenswrapper[4691]: E0930 07:48:16.996799 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9496dbb-2305-4955-8584-250778e74b43" containerName="extract-content"
Sep 30 07:48:16 crc kubenswrapper[4691]: I0930 07:48:16.996815 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9496dbb-2305-4955-8584-250778e74b43" containerName="extract-content"
Sep 30 07:48:16 crc kubenswrapper[4691]: E0930 07:48:16.996844 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9496dbb-2305-4955-8584-250778e74b43" containerName="extract-utilities"
Sep 30 07:48:16 crc kubenswrapper[4691]: I0930 07:48:16.996853 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9496dbb-2305-4955-8584-250778e74b43" containerName="extract-utilities"
Sep 30 07:48:16 crc kubenswrapper[4691]: E0930 07:48:16.996869 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9496dbb-2305-4955-8584-250778e74b43" containerName="registry-server"
Sep 30 07:48:16 crc kubenswrapper[4691]: I0930 07:48:16.996878 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9496dbb-2305-4955-8584-250778e74b43" containerName="registry-server"
Sep 30 07:48:16 crc kubenswrapper[4691]: I0930 07:48:16.997159 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9496dbb-2305-4955-8584-250778e74b43" containerName="registry-server"
Sep 30 07:48:16 crc kubenswrapper[4691]: I0930 07:48:16.999017 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.006205 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ds2r8"]
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.068301 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjbrq\" (UniqueName: \"kubernetes.io/projected/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-kube-api-access-mjbrq\") pod \"certified-operators-ds2r8\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") " pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.068390 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-utilities\") pod \"certified-operators-ds2r8\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") " pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.068512 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-catalog-content\") pod \"certified-operators-ds2r8\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") " pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.170555 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-catalog-content\") pod \"certified-operators-ds2r8\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") " pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.170730 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjbrq\" (UniqueName: \"kubernetes.io/projected/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-kube-api-access-mjbrq\") pod \"certified-operators-ds2r8\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") " pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.170835 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-utilities\") pod \"certified-operators-ds2r8\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") " pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.171175 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-catalog-content\") pod \"certified-operators-ds2r8\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") " pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.171186 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-utilities\") pod \"certified-operators-ds2r8\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") " pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.202598 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjbrq\" (UniqueName: \"kubernetes.io/projected/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-kube-api-access-mjbrq\") pod \"certified-operators-ds2r8\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") " pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.341095 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:17 crc kubenswrapper[4691]: I0930 07:48:17.897736 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ds2r8"]
Sep 30 07:48:18 crc kubenswrapper[4691]: I0930 07:48:18.199954 4691 generic.go:334] "Generic (PLEG): container finished" podID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerID="eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522" exitCode=0
Sep 30 07:48:18 crc kubenswrapper[4691]: I0930 07:48:18.200055 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds2r8" event={"ID":"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1","Type":"ContainerDied","Data":"eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522"}
Sep 30 07:48:18 crc kubenswrapper[4691]: I0930 07:48:18.200306 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds2r8" event={"ID":"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1","Type":"ContainerStarted","Data":"9fc4d57f244baba925a24f519eaeed34b56d7044b3d996a9f90e1c85952fe918"}
Sep 30 07:48:19 crc kubenswrapper[4691]: I0930 07:48:19.216203 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds2r8" event={"ID":"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1","Type":"ContainerStarted","Data":"8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44"}
Sep 30 07:48:20 crc kubenswrapper[4691]: I0930 07:48:20.231617 4691 generic.go:334] "Generic (PLEG): container finished" podID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerID="8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44" exitCode=0
Sep 30 07:48:20 crc kubenswrapper[4691]: I0930 07:48:20.231729 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds2r8" event={"ID":"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1","Type":"ContainerDied","Data":"8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44"}
Sep 30 07:48:21 crc kubenswrapper[4691]: I0930 07:48:21.245532 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds2r8" event={"ID":"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1","Type":"ContainerStarted","Data":"f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0"}
Sep 30 07:48:21 crc kubenswrapper[4691]: I0930 07:48:21.281562 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ds2r8" podStartSLOduration=2.855499361 podStartE2EDuration="5.281544306s" podCreationTimestamp="2025-09-30 07:48:16 +0000 UTC" firstStartedPulling="2025-09-30 07:48:18.202056832 +0000 UTC m=+5341.677077862" lastFinishedPulling="2025-09-30 07:48:20.628101767 +0000 UTC m=+5344.103122807" observedRunningTime="2025-09-30 07:48:21.272188748 +0000 UTC m=+5344.747209868" watchObservedRunningTime="2025-09-30 07:48:21.281544306 +0000 UTC m=+5344.756565346"
Sep 30 07:48:27 crc kubenswrapper[4691]: I0930 07:48:27.341259 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:27 crc kubenswrapper[4691]: I0930 07:48:27.343070 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:27 crc kubenswrapper[4691]: I0930 07:48:27.415246 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:28 crc kubenswrapper[4691]: I0930 07:48:28.225062 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:48:28 crc kubenswrapper[4691]: E0930 07:48:28.225757 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:48:28 crc kubenswrapper[4691]: I0930 07:48:28.390083 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:28 crc kubenswrapper[4691]: I0930 07:48:28.448162 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ds2r8"]
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.333090 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ds2r8" podUID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerName="registry-server" containerID="cri-o://f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0" gracePeriod=2
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.843773 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.862504 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-utilities\") pod \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") "
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.862655 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjbrq\" (UniqueName: \"kubernetes.io/projected/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-kube-api-access-mjbrq\") pod \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") "
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.862782 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-catalog-content\") pod \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\" (UID: \"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1\") "
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.864204 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-utilities" (OuterVolumeSpecName: "utilities") pod "8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" (UID: "8b1025f6-05a8-4d8c-9fdb-83955f2c38d1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.870930 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-kube-api-access-mjbrq" (OuterVolumeSpecName: "kube-api-access-mjbrq") pod "8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" (UID: "8b1025f6-05a8-4d8c-9fdb-83955f2c38d1"). InnerVolumeSpecName "kube-api-access-mjbrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.929344 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" (UID: "8b1025f6-05a8-4d8c-9fdb-83955f2c38d1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.965301 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.965366 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:48:30 crc kubenswrapper[4691]: I0930 07:48:30.965377 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjbrq\" (UniqueName: \"kubernetes.io/projected/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1-kube-api-access-mjbrq\") on node \"crc\" DevicePath \"\""
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.346412 4691 generic.go:334] "Generic (PLEG): container finished" podID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerID="f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0" exitCode=0
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.346499 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds2r8" event={"ID":"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1","Type":"ContainerDied","Data":"f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0"}
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.346537 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ds2r8"
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.346568 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds2r8" event={"ID":"8b1025f6-05a8-4d8c-9fdb-83955f2c38d1","Type":"ContainerDied","Data":"9fc4d57f244baba925a24f519eaeed34b56d7044b3d996a9f90e1c85952fe918"}
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.346596 4691 scope.go:117] "RemoveContainer" containerID="f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0"
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.373624 4691 scope.go:117] "RemoveContainer" containerID="8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44"
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.376724 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ds2r8"]
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.387186 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ds2r8"]
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.396441 4691 scope.go:117] "RemoveContainer" containerID="eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522"
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.450666 4691 scope.go:117] "RemoveContainer" containerID="f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0"
Sep 30 07:48:31 crc kubenswrapper[4691]: E0930 07:48:31.451232 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0\": container with ID starting with f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0 not found: ID does not exist" containerID="f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0"
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.451340 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0"} err="failed to get container status \"f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0\": rpc error: code = NotFound desc = could not find container \"f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0\": container with ID starting with f6e7c58cd9ae645c6c68ae6a4ec8640a6986b006373587f77352a3cc09b3f5e0 not found: ID does not exist"
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.451380 4691 scope.go:117] "RemoveContainer" containerID="8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44"
Sep 30 07:48:31 crc kubenswrapper[4691]: E0930 07:48:31.451845 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44\": container with ID starting with 8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44 not found: ID does not exist" containerID="8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44"
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.451901 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44"} err="failed to get container status \"8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44\": rpc error: code = NotFound desc = could not find container \"8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44\": container with ID starting with 8a8e68383bb335cac2e582b6ee7d6fc9efe4b5517c1113a603b6577d44b9ff44 not found: ID does not exist"
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.451928 4691 scope.go:117] "RemoveContainer" containerID="eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522"
Sep 30 07:48:31 crc kubenswrapper[4691]: E0930 07:48:31.452260 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522\": container with ID starting with eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522 not found: ID does not exist" containerID="eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522"
Sep 30 07:48:31 crc kubenswrapper[4691]: I0930 07:48:31.452309 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522"} err="failed to get container status \"eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522\": rpc error: code = NotFound desc = could not find container \"eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522\": container with ID starting with eafa3cf1dfb33864bad3c690c50accd3ebb0f4c83c0a6f6a74de0431694e2522 not found: ID does not exist"
Sep 30 07:48:33 crc kubenswrapper[4691]: I0930 07:48:33.236905 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" path="/var/lib/kubelet/pods/8b1025f6-05a8-4d8c-9fdb-83955f2c38d1/volumes"
Sep 30 07:48:40 crc kubenswrapper[4691]: I0930 07:48:40.225936 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:48:40 crc kubenswrapper[4691]: E0930 07:48:40.227132 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:48:52 crc kubenswrapper[4691]: I0930 07:48:52.224787 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:48:52 crc kubenswrapper[4691]: E0930 07:48:52.225475 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:49:06 crc kubenswrapper[4691]: I0930 07:49:06.225343 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:49:06 crc kubenswrapper[4691]: E0930 07:49:06.226020 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:49:17 crc kubenswrapper[4691]: I0930 07:49:17.240279 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:49:17 crc kubenswrapper[4691]: E0930 07:49:17.241336 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:49:28 crc kubenswrapper[4691]: I0930 07:49:28.225015 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:49:28 crc kubenswrapper[4691]: E0930 07:49:28.225795 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:49:42 crc kubenswrapper[4691]: I0930 07:49:42.226002 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:49:42 crc kubenswrapper[4691]: E0930 07:49:42.227375 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:49:54 crc kubenswrapper[4691]: I0930 07:49:54.225867 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:49:54 crc kubenswrapper[4691]: E0930 07:49:54.226931 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:50:07 crc kubenswrapper[4691]: I0930 07:50:07.237167 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:50:07 crc kubenswrapper[4691]: E0930 07:50:07.237976 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:50:20 crc kubenswrapper[4691]: I0930 07:50:20.224984 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:50:20 crc kubenswrapper[4691]: E0930 07:50:20.225586 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:50:33 crc kubenswrapper[4691]: I0930 07:50:33.225513 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:50:33 crc kubenswrapper[4691]: E0930 07:50:33.226576 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:50:47 crc kubenswrapper[4691]: I0930 07:50:47.239803 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:50:47 crc kubenswrapper[4691]: E0930 07:50:47.240777 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:50:58 crc kubenswrapper[4691]: I0930 07:50:58.225942 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:50:58 crc kubenswrapper[4691]: E0930 07:50:58.226754 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:51:09 crc kubenswrapper[4691]: I0930 07:51:09.225852 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:51:09 crc kubenswrapper[4691]: E0930 07:51:09.227234 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:51:23 crc kubenswrapper[4691]: I0930 07:51:23.225689 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:51:23 crc kubenswrapper[4691]: E0930 07:51:23.226768 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:51:35 crc kubenswrapper[4691]: I0930 07:51:35.225630 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:51:35 crc kubenswrapper[4691]: E0930 07:51:35.226520 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:51:46 crc kubenswrapper[4691]: I0930 07:51:46.225388 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:51:46 crc kubenswrapper[4691]: E0930 07:51:46.226446 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:51:59 crc kubenswrapper[4691]: I0930 07:51:59.226127 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:51:59 crc kubenswrapper[4691]: E0930 07:51:59.227497 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:52:12 crc kubenswrapper[4691]: I0930 07:52:12.226591 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:52:12 crc kubenswrapper[4691]: E0930 07:52:12.228011 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:52:23 crc kubenswrapper[4691]: I0930 07:52:23.924318 4691 generic.go:334] "Generic (PLEG): container finished" podID="d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" containerID="cb3fe673e142ffeb08a833510f05b62401a774565973e50edb442d4e439e69f9" exitCode=0
Sep 30 07:52:23 crc kubenswrapper[4691]: I0930 07:52:23.924446 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03","Type":"ContainerDied","Data":"cb3fe673e142ffeb08a833510f05b62401a774565973e50edb442d4e439e69f9"}
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.225674 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da"
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.429378 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.617089 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config\") pod \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") "
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.617315 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-temporary\") pod \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") "
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.617751 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ssh-key\") pod \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") "
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.617829 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ca-certs\") pod \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") "
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.617942 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config-secret\") pod \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") "
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.618500 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" (UID: "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.618051 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") "
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.618792 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vxc8\" (UniqueName: \"kubernetes.io/projected/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-kube-api-access-2vxc8\") pod \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") "
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.618867 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-config-data\") pod \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") "
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.619679 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-workdir\") pod \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\" (UID: \"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03\") "
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.619961 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-config-data" (OuterVolumeSpecName: "config-data") pod "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" (UID: "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.625318 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "test-operator-logs") pod "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" (UID: "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.625841 4691 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" "
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.625883 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-config-data\") on node \"crc\" DevicePath \"\""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.625928 4691 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.627158 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" (UID: "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.628033 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-kube-api-access-2vxc8" (OuterVolumeSpecName: "kube-api-access-2vxc8") pod "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" (UID: "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03"). InnerVolumeSpecName "kube-api-access-2vxc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.656361 4691 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc"
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.657953 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" (UID: "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.664151 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" (UID: "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.674030 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" (UID: "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.687459 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" (UID: "d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.728255 4691 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ssh-key\") on node \"crc\" DevicePath \"\""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.728289 4691 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-ca-certs\") on node \"crc\" DevicePath \"\""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.728304 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.728317 4691 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.728330 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vxc8\" (UniqueName: \"kubernetes.io/projected/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-kube-api-access-2vxc8\") on node \"crc\" DevicePath \"\""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.728343 4691 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.728356 4691 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03-openstack-config\") on node \"crc\" DevicePath \"\""
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.948701 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03","Type":"ContainerDied","Data":"54e2ee440c2cee54f29fc553d6e14510d13c1955ca9fffbd05c990aeda29307d"}
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.949092 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54e2ee440c2cee54f29fc553d6e14510d13c1955ca9fffbd05c990aeda29307d"
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.948829 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Sep 30 07:52:25 crc kubenswrapper[4691]: I0930 07:52:25.951590 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"37d19b6a60363a5a6aed5989b48791cca474e4f931fda010298ee4b265d6d360"}
Sep 30 07:52:27 crc kubenswrapper[4691]: I0930 07:52:27.801124 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="08782d24-2bd9-48d6-b9b2-12a2ad66e6d0" containerName="galera" probeResult="failure" output="command timed out"
Sep 30 07:52:27 crc kubenswrapper[4691]: I0930 07:52:27.801412 4691 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="08782d24-2bd9-48d6-b9b2-12a2ad66e6d0" containerName="galera" probeResult="failure" output="command timed out"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.537488 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Sep 30 07:52:35 crc kubenswrapper[4691]: E0930 07:52:35.538589 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerName="registry-server"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.538606 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerName="registry-server"
Sep 30 07:52:35 crc kubenswrapper[4691]: E0930 07:52:35.538641 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerName="extract-utilities"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.538650 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerName="extract-utilities"
Sep 30 07:52:35 crc kubenswrapper[4691]: E0930 07:52:35.538666 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" containerName="tempest-tests-tempest-tests-runner"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.538675 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" containerName="tempest-tests-tempest-tests-runner"
Sep 30 07:52:35 crc kubenswrapper[4691]: E0930 07:52:35.538694 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerName="extract-content"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.538701 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerName="extract-content"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.539005 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b1025f6-05a8-4d8c-9fdb-83955f2c38d1" containerName="registry-server"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.539041 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03" containerName="tempest-tests-tempest-tests-runner"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.540026 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.543000 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-st99m"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.551968 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.655599 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e1d484c8-a1d8-4c39-89fb-4b7679e1c22a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.655788 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24jbv\" (UniqueName: \"kubernetes.io/projected/e1d484c8-a1d8-4c39-89fb-4b7679e1c22a-kube-api-access-24jbv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e1d484c8-a1d8-4c39-89fb-4b7679e1c22a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.758352 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e1d484c8-a1d8-4c39-89fb-4b7679e1c22a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.758612 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24jbv\" (UniqueName: \"kubernetes.io/projected/e1d484c8-a1d8-4c39-89fb-4b7679e1c22a-kube-api-access-24jbv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e1d484c8-a1d8-4c39-89fb-4b7679e1c22a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.759654 4691 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e1d484c8-a1d8-4c39-89fb-4b7679e1c22a\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.808332 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24jbv\" (UniqueName: \"kubernetes.io/projected/e1d484c8-a1d8-4c39-89fb-4b7679e1c22a-kube-api-access-24jbv\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e1d484c8-a1d8-4c39-89fb-4b7679e1c22a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.811191 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"e1d484c8-a1d8-4c39-89fb-4b7679e1c22a\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Sep 30 07:52:35 crc kubenswrapper[4691]: I0930 07:52:35.877825 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Sep 30 07:52:36 crc kubenswrapper[4691]: I0930 07:52:36.220563 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Sep 30 07:52:36 crc kubenswrapper[4691]: W0930 07:52:36.224942 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1d484c8_a1d8_4c39_89fb_4b7679e1c22a.slice/crio-7ca05fdfdf8926b44e711895aee4c21a64853159739b4d63354253f2ff442340 WatchSource:0}: Error finding container 7ca05fdfdf8926b44e711895aee4c21a64853159739b4d63354253f2ff442340: Status 404 returned error can't find the container with id 7ca05fdfdf8926b44e711895aee4c21a64853159739b4d63354253f2ff442340
Sep 30 07:52:36 crc kubenswrapper[4691]: I0930 07:52:36.228122 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Sep 30 07:52:37 crc kubenswrapper[4691]: I0930 07:52:37.083381 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e1d484c8-a1d8-4c39-89fb-4b7679e1c22a","Type":"ContainerStarted","Data":"7ca05fdfdf8926b44e711895aee4c21a64853159739b4d63354253f2ff442340"}
Sep 30 07:52:38 crc kubenswrapper[4691]: I0930 07:52:38.097278 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"e1d484c8-a1d8-4c39-89fb-4b7679e1c22a","Type":"ContainerStarted","Data":"ebe339d8908abe63378eea84da8df921636ff826fbc1d672ee5f7a24072700bc"}
Sep 30 07:52:38 crc kubenswrapper[4691]: I0930 07:52:38.120759 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=1.6357563499999999 podStartE2EDuration="3.120733476s" podCreationTimestamp="2025-09-30 07:52:35 +0000 UTC" firstStartedPulling="2025-09-30 07:52:36.227872525 +0000 UTC m=+5599.702893565" lastFinishedPulling="2025-09-30 07:52:37.712849651 +0000 UTC m=+5601.187870691" observedRunningTime="2025-09-30 07:52:38.111680036 +0000 UTC m=+5601.586701106" watchObservedRunningTime="2025-09-30 07:52:38.120733476 +0000 UTC m=+5601.595754556"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.538754 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-78mzf/must-gather-6xhk8"]
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.542144 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-78mzf/must-gather-6xhk8"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.547398 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-78mzf"/"openshift-service-ca.crt"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.547832 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-78mzf"/"default-dockercfg-cft55"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.547940 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-78mzf"/"kube-root-ca.crt"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.556383 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-78mzf/must-gather-6xhk8"]
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.618397 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8405e9df-8bb3-4a22-8b85-1fa652143de8-must-gather-output\") pod \"must-gather-6xhk8\" (UID: \"8405e9df-8bb3-4a22-8b85-1fa652143de8\") " pod="openshift-must-gather-78mzf/must-gather-6xhk8"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.618504 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj6zx\" (UniqueName: \"kubernetes.io/projected/8405e9df-8bb3-4a22-8b85-1fa652143de8-kube-api-access-kj6zx\") pod \"must-gather-6xhk8\" (UID: \"8405e9df-8bb3-4a22-8b85-1fa652143de8\") " pod="openshift-must-gather-78mzf/must-gather-6xhk8"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.719807 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8405e9df-8bb3-4a22-8b85-1fa652143de8-must-gather-output\") pod \"must-gather-6xhk8\" (UID: \"8405e9df-8bb3-4a22-8b85-1fa652143de8\") " pod="openshift-must-gather-78mzf/must-gather-6xhk8"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.719958 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kj6zx\" (UniqueName: \"kubernetes.io/projected/8405e9df-8bb3-4a22-8b85-1fa652143de8-kube-api-access-kj6zx\") pod \"must-gather-6xhk8\" (UID: \"8405e9df-8bb3-4a22-8b85-1fa652143de8\") " pod="openshift-must-gather-78mzf/must-gather-6xhk8"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.720223 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8405e9df-8bb3-4a22-8b85-1fa652143de8-must-gather-output\") pod \"must-gather-6xhk8\" (UID: \"8405e9df-8bb3-4a22-8b85-1fa652143de8\") " pod="openshift-must-gather-78mzf/must-gather-6xhk8"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.739621 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj6zx\" (UniqueName: \"kubernetes.io/projected/8405e9df-8bb3-4a22-8b85-1fa652143de8-kube-api-access-kj6zx\") pod \"must-gather-6xhk8\" (UID: \"8405e9df-8bb3-4a22-8b85-1fa652143de8\") " pod="openshift-must-gather-78mzf/must-gather-6xhk8"
Sep 30 07:52:56 crc kubenswrapper[4691]: I0930 07:52:56.897508 4691 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-78mzf/must-gather-6xhk8" Sep 30 07:52:57 crc kubenswrapper[4691]: I0930 07:52:57.538194 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-78mzf/must-gather-6xhk8"] Sep 30 07:52:58 crc kubenswrapper[4691]: I0930 07:52:58.336628 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/must-gather-6xhk8" event={"ID":"8405e9df-8bb3-4a22-8b85-1fa652143de8","Type":"ContainerStarted","Data":"a8d61d3b2f26498b95f274c9fed7bb0ac02dc1a6ba659401d319cb56f37371ef"} Sep 30 07:53:04 crc kubenswrapper[4691]: I0930 07:53:04.405290 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/must-gather-6xhk8" event={"ID":"8405e9df-8bb3-4a22-8b85-1fa652143de8","Type":"ContainerStarted","Data":"c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d"} Sep 30 07:53:04 crc kubenswrapper[4691]: I0930 07:53:04.407142 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/must-gather-6xhk8" event={"ID":"8405e9df-8bb3-4a22-8b85-1fa652143de8","Type":"ContainerStarted","Data":"bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec"} Sep 30 07:53:08 crc kubenswrapper[4691]: I0930 07:53:08.211560 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-78mzf/must-gather-6xhk8" podStartSLOduration=6.098694172 podStartE2EDuration="12.211544496s" podCreationTimestamp="2025-09-30 07:52:56 +0000 UTC" firstStartedPulling="2025-09-30 07:52:57.550966647 +0000 UTC m=+5621.025987697" lastFinishedPulling="2025-09-30 07:53:03.663816981 +0000 UTC m=+5627.138838021" observedRunningTime="2025-09-30 07:53:04.431477084 +0000 UTC m=+5627.906498134" watchObservedRunningTime="2025-09-30 07:53:08.211544496 +0000 UTC m=+5631.686565536" Sep 30 07:53:08 crc kubenswrapper[4691]: I0930 07:53:08.219434 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-78mzf/crc-debug-dzmz8"] Sep 30 07:53:08 crc kubenswrapper[4691]: I0930 07:53:08.220859 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-dzmz8" Sep 30 07:53:08 crc kubenswrapper[4691]: I0930 07:53:08.394615 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63ec617d-5809-4806-a0a5-bcc4421b2737-host\") pod \"crc-debug-dzmz8\" (UID: \"63ec617d-5809-4806-a0a5-bcc4421b2737\") " pod="openshift-must-gather-78mzf/crc-debug-dzmz8" Sep 30 07:53:08 crc kubenswrapper[4691]: I0930 07:53:08.395154 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7nqd\" (UniqueName: \"kubernetes.io/projected/63ec617d-5809-4806-a0a5-bcc4421b2737-kube-api-access-n7nqd\") pod \"crc-debug-dzmz8\" (UID: \"63ec617d-5809-4806-a0a5-bcc4421b2737\") " pod="openshift-must-gather-78mzf/crc-debug-dzmz8" Sep 30 07:53:08 crc kubenswrapper[4691]: I0930 07:53:08.502454 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n7nqd\" (UniqueName: \"kubernetes.io/projected/63ec617d-5809-4806-a0a5-bcc4421b2737-kube-api-access-n7nqd\") pod \"crc-debug-dzmz8\" (UID: \"63ec617d-5809-4806-a0a5-bcc4421b2737\") " pod="openshift-must-gather-78mzf/crc-debug-dzmz8" Sep 30 07:53:08 crc kubenswrapper[4691]: I0930 07:53:08.503039 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63ec617d-5809-4806-a0a5-bcc4421b2737-host\") pod \"crc-debug-dzmz8\" (UID: \"63ec617d-5809-4806-a0a5-bcc4421b2737\") " pod="openshift-must-gather-78mzf/crc-debug-dzmz8" Sep 30 07:53:08 crc kubenswrapper[4691]: I0930 07:53:08.503365 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63ec617d-5809-4806-a0a5-bcc4421b2737-host\") pod \"crc-debug-dzmz8\" (UID: \"63ec617d-5809-4806-a0a5-bcc4421b2737\") " pod="openshift-must-gather-78mzf/crc-debug-dzmz8" Sep 30 07:53:08 crc kubenswrapper[4691]: I0930 07:53:08.917352 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7nqd\" (UniqueName: \"kubernetes.io/projected/63ec617d-5809-4806-a0a5-bcc4421b2737-kube-api-access-n7nqd\") pod \"crc-debug-dzmz8\" (UID: \"63ec617d-5809-4806-a0a5-bcc4421b2737\") " pod="openshift-must-gather-78mzf/crc-debug-dzmz8" Sep 30 07:53:09 crc kubenswrapper[4691]: I0930 07:53:09.136774 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-dzmz8" Sep 30 07:53:09 crc kubenswrapper[4691]: W0930 07:53:09.191409 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63ec617d_5809_4806_a0a5_bcc4421b2737.slice/crio-0b7a05b6eb5903371e5800896ea2206fd7fca29e057dc86e10b56c69e3c711cf WatchSource:0}: Error finding container 0b7a05b6eb5903371e5800896ea2206fd7fca29e057dc86e10b56c69e3c711cf: Status 404 returned error can't find the container with id 0b7a05b6eb5903371e5800896ea2206fd7fca29e057dc86e10b56c69e3c711cf Sep 30 07:53:09 crc kubenswrapper[4691]: I0930 07:53:09.463532 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/crc-debug-dzmz8" event={"ID":"63ec617d-5809-4806-a0a5-bcc4421b2737","Type":"ContainerStarted","Data":"0b7a05b6eb5903371e5800896ea2206fd7fca29e057dc86e10b56c69e3c711cf"} Sep 30 07:53:19 crc kubenswrapper[4691]: I0930 07:53:19.561486 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/crc-debug-dzmz8" event={"ID":"63ec617d-5809-4806-a0a5-bcc4421b2737","Type":"ContainerStarted","Data":"aaa30257679c618c88bf823386d62d3dc5ac8a9cfb64188ccb12e77069ef622e"} Sep 30 07:54:28 crc kubenswrapper[4691]: I0930 07:54:28.284746 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-68888bb5f6-d225g_29d7ded5-bae4-41e2-9aa9-c959091d3696/barbican-api-log/0.log" Sep 30 07:54:28 crc kubenswrapper[4691]: I0930 07:54:28.343374 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-68888bb5f6-d225g_29d7ded5-bae4-41e2-9aa9-c959091d3696/barbican-api/0.log" Sep 30 07:54:28 crc kubenswrapper[4691]: I0930 07:54:28.532891 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5c7dfdd7cd-qdz5k_bccc96eb-4a1a-44bc-8086-eb5e7a7ce253/barbican-keystone-listener/0.log" Sep 30 07:54:28 crc kubenswrapper[4691]: I0930 07:54:28.657174 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5c7dfdd7cd-qdz5k_bccc96eb-4a1a-44bc-8086-eb5e7a7ce253/barbican-keystone-listener-log/0.log" Sep 30 07:54:28 crc kubenswrapper[4691]: I0930 07:54:28.715938 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d7ff878f-9tz9w_9d94cb2d-a415-4b43-9976-0a844c446734/barbican-worker/0.log" Sep 30 07:54:28 crc kubenswrapper[4691]: I0930 07:54:28.868442 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d7ff878f-9tz9w_9d94cb2d-a415-4b43-9976-0a844c446734/barbican-worker-log/0.log" Sep 30 07:54:29 crc kubenswrapper[4691]: I0930 07:54:29.032396 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x_c6027156-9dfc-40c5-b265-96d0231b32d6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:29 crc kubenswrapper[4691]: I0930 07:54:29.279445 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ac8d6a42-d8ce-419f-ae31-d9746dcedea9/ceilometer-notification-agent/0.log" Sep 30 07:54:29 crc kubenswrapper[4691]: I0930 07:54:29.290413 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ac8d6a42-d8ce-419f-ae31-d9746dcedea9/ceilometer-central-agent/0.log" Sep 30 07:54:29 crc kubenswrapper[4691]: I0930 07:54:29.352271 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_ac8d6a42-d8ce-419f-ae31-d9746dcedea9/proxy-httpd/0.log" Sep 30 07:54:29 crc kubenswrapper[4691]: I0930 07:54:29.453784 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ac8d6a42-d8ce-419f-ae31-d9746dcedea9/sg-core/0.log" Sep 30 07:54:29 crc kubenswrapper[4691]: I0930 07:54:29.713129 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_397c7023-cd6a-42ac-8d37-5813f5f9d45e/cinder-api-log/0.log" Sep 30 07:54:29 crc kubenswrapper[4691]: I0930 07:54:29.815852 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_397c7023-cd6a-42ac-8d37-5813f5f9d45e/cinder-api/0.log" Sep 30 07:54:29 crc kubenswrapper[4691]: I0930 07:54:29.948815 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0937a7d3-6bf0-4114-b73b-0d10f2f19945/cinder-scheduler/0.log" Sep 30 07:54:30 crc kubenswrapper[4691]: I0930 07:54:30.059474 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0937a7d3-6bf0-4114-b73b-0d10f2f19945/probe/0.log" Sep 30 07:54:30 crc kubenswrapper[4691]: I0930 07:54:30.233315 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-528ll_6b78a233-7f96-48a0-b484-0bb1196d8d4e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:30 crc kubenswrapper[4691]: I0930 07:54:30.355471 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp_6bb5c646-a0b7-4ed5-b5ef-28727886b271/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:30 crc kubenswrapper[4691]: I0930 07:54:30.529070 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c6b9844bc-q6q6n_c18992fb-4c6e-4a18-a9b9-f00db9817b1b/init/0.log" Sep 30 07:54:30 crc kubenswrapper[4691]: I0930 07:54:30.769316 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c6b9844bc-q6q6n_c18992fb-4c6e-4a18-a9b9-f00db9817b1b/init/0.log" Sep 30 07:54:30 crc kubenswrapper[4691]: I0930 07:54:30.858746 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c6b9844bc-q6q6n_c18992fb-4c6e-4a18-a9b9-f00db9817b1b/dnsmasq-dns/0.log" Sep 30 07:54:30 crc kubenswrapper[4691]: I0930 07:54:30.981990 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh_f8536c3f-e28e-49a1-9b22-bb6ab2652c5b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:31 crc kubenswrapper[4691]: I0930 07:54:31.056734 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d2a83f16-21dd-442b-b27d-6c583c783055/glance-httpd/0.log" Sep 30 07:54:31 crc kubenswrapper[4691]: I0930 07:54:31.232527 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d2a83f16-21dd-442b-b27d-6c583c783055/glance-log/0.log" Sep 30 07:54:31 crc kubenswrapper[4691]: I0930 07:54:31.549456 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_34863af3-4c23-43ce-b483-713ca0d1f744/glance-httpd/0.log" Sep 30 07:54:31 crc kubenswrapper[4691]: I0930 07:54:31.641172 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_34863af3-4c23-43ce-b483-713ca0d1f744/glance-log/0.log" Sep 30 
07:54:31 crc kubenswrapper[4691]: I0930 07:54:31.801441 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56948c48fd-czzmm_fec95e1e-14f4-4093-b1d4-402c29686348/horizon/0.log" Sep 30 07:54:31 crc kubenswrapper[4691]: I0930 07:54:31.891505 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-crvv4_98946d0d-1b03-4bf2-bd9b-71105ac901f8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:32 crc kubenswrapper[4691]: I0930 07:54:32.139691 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-wlzdt_455e6d2b-cc2e-4b09-899d-f913094c603f/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:32 crc kubenswrapper[4691]: I0930 07:54:32.296243 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56948c48fd-czzmm_fec95e1e-14f4-4093-b1d4-402c29686348/horizon-log/0.log" Sep 30 07:54:32 crc kubenswrapper[4691]: I0930 07:54:32.468587 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320261-kh7r8_a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b/keystone-cron/0.log" Sep 30 07:54:32 crc kubenswrapper[4691]: I0930 07:54:32.620446 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_36b81859-2533-442a-bf54-a2fe2a8a5baa/kube-state-metrics/0.log" Sep 30 07:54:32 crc kubenswrapper[4691]: I0930 07:54:32.894790 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bf6754cd6-fsq4c_5fef5a53-6fb3-4c3b-8929-e9e49f85f050/keystone-api/0.log" Sep 30 07:54:33 crc kubenswrapper[4691]: I0930 07:54:33.063100 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6_8e08d67e-28fd-4a4b-905a-765d0e33013d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:33 crc kubenswrapper[4691]: I0930 07:54:33.708708 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8687477df-8l865_7d0f6749-bfde-4329-9905-f51ef18e904c/neutron-httpd/0.log" Sep 30 07:54:33 crc kubenswrapper[4691]: I0930 07:54:33.718583 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5c310640-e561-4e1e-8f7c-046a7eec139d/memcached/0.log" Sep 30 07:54:33 crc kubenswrapper[4691]: I0930 07:54:33.750222 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8687477df-8l865_7d0f6749-bfde-4329-9905-f51ef18e904c/neutron-api/0.log" Sep 30 07:54:33 crc kubenswrapper[4691]: I0930 07:54:33.756130 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r_214c8c4f-8184-4b59-9fcd-c1112551b5b2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:34 crc kubenswrapper[4691]: I0930 07:54:34.592668 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c523d401-a3b1-4181-8216-bbf80156c7c4/nova-cell0-conductor-conductor/0.log" Sep 30 07:54:34 crc kubenswrapper[4691]: I0930 07:54:34.742819 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b0efbbe5-44f2-4424-9a32-476f81246c28/nova-cell1-conductor-conductor/0.log" Sep 30 07:54:35 crc kubenswrapper[4691]: I0930 07:54:35.065765 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3a61f6fc-3212-4050-92f5-363ed195680f/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 07:54:35 crc kubenswrapper[4691]: I0930 07:54:35.358865 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-z8bsh_c6e5786f-d234-454c-8276-5355726052be/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:35 crc kubenswrapper[4691]: I0930 07:54:35.375189 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f8b5ef76-7c77-4698-9f7f-219791e59bd2/nova-api-log/0.log" Sep 30 07:54:35 crc kubenswrapper[4691]: I0930 07:54:35.627640 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f8b5ef76-7c77-4698-9f7f-219791e59bd2/nova-api-api/0.log" Sep 30 07:54:35 crc kubenswrapper[4691]: I0930 07:54:35.664434 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4f6cce79-72b3-407a-8ac5-ca3782a878b5/nova-metadata-log/0.log" Sep 30 07:54:35 crc kubenswrapper[4691]: I0930 07:54:35.950567 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6/mysql-bootstrap/0.log" Sep 30 07:54:36 crc kubenswrapper[4691]: I0930 07:54:36.103141 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6/mysql-bootstrap/0.log" Sep 30 07:54:36 crc kubenswrapper[4691]: I0930 07:54:36.112000 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_24eb00d6-56e5-477b-840b-ad3f6fd6e473/nova-scheduler-scheduler/0.log" Sep 30 07:54:36 crc kubenswrapper[4691]: I0930 07:54:36.214525 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6/galera/0.log" Sep 30 07:54:36 crc kubenswrapper[4691]: I0930 07:54:36.375563 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_08782d24-2bd9-48d6-b9b2-12a2ad66e6d0/mysql-bootstrap/0.log" Sep 30 07:54:36 crc kubenswrapper[4691]: I0930 07:54:36.609684 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_08782d24-2bd9-48d6-b9b2-12a2ad66e6d0/galera/0.log" Sep 30 07:54:36 crc kubenswrapper[4691]: I0930 07:54:36.628777 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_08782d24-2bd9-48d6-b9b2-12a2ad66e6d0/mysql-bootstrap/0.log" Sep 30 07:54:36 crc kubenswrapper[4691]: I0930 07:54:36.814950 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2wmg8_86397f09-76d1-4c35-a96a-5b6bde1e3574/ovn-controller/0.log" Sep 30 07:54:36 crc kubenswrapper[4691]: I0930 07:54:36.841136 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee/openstackclient/0.log" Sep 30 07:54:37 crc kubenswrapper[4691]: I0930 07:54:37.056633 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8h9p5_24478def-6fea-4596-b4e3-fd3abee81a62/openstack-network-exporter/0.log" Sep 30 07:54:37 crc kubenswrapper[4691]: I0930 07:54:37.123822 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4f6cce79-72b3-407a-8ac5-ca3782a878b5/nova-metadata-metadata/0.log" Sep 30 07:54:37 crc kubenswrapper[4691]: I0930 07:54:37.308722 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-csq87_99bbc7fe-4a99-4f60-b840-8843790d6cb4/ovsdb-server-init/0.log" Sep 30 07:54:37 crc kubenswrapper[4691]: I0930 07:54:37.658430 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-csq87_99bbc7fe-4a99-4f60-b840-8843790d6cb4/ovsdb-server-init/0.log" Sep 30 07:54:37 crc kubenswrapper[4691]: I0930 07:54:37.661084 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-csq87_99bbc7fe-4a99-4f60-b840-8843790d6cb4/ovsdb-server/0.log" Sep 30 07:54:37 crc kubenswrapper[4691]: I0930 07:54:37.820709 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-csq87_99bbc7fe-4a99-4f60-b840-8843790d6cb4/ovs-vswitchd/0.log" Sep 30 07:54:37 crc kubenswrapper[4691]: I0930 07:54:37.885741 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bl96m_8638fef8-f042-4cc1-949d-fb0c107085b5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:37 crc kubenswrapper[4691]: I0930 07:54:37.960273 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e59b91c6-5922-4272-9c75-4e139031c87b/openstack-network-exporter/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.010096 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e59b91c6-5922-4272-9c75-4e139031c87b/ovn-northd/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.113475 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_46482328-297b-40b1-83e1-2270733d27d7/openstack-network-exporter/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.140019 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_46482328-297b-40b1-83e1-2270733d27d7/ovsdbserver-nb/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.316718 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_48c486cf-48da-4fd0-b450-d821ab6b2755/openstack-network-exporter/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.352029 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_48c486cf-48da-4fd0-b450-d821ab6b2755/ovsdbserver-sb/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.543253 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56d945494d-7svb6_d35f539b-5139-4155-8f51-a1e425e19925/placement-api/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.651361 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5621e369-5e8a-491d-aa26-098025c50c2f/init-config-reloader/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.680872 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56d945494d-7svb6_d35f539b-5139-4155-8f51-a1e425e19925/placement-log/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.756485 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5621e369-5e8a-491d-aa26-098025c50c2f/init-config-reloader/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.814204 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5621e369-5e8a-491d-aa26-098025c50c2f/config-reloader/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.862001 4691 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5621e369-5e8a-491d-aa26-098025c50c2f/prometheus/0.log" Sep 30 07:54:38 crc kubenswrapper[4691]: I0930 07:54:38.887687 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5621e369-5e8a-491d-aa26-098025c50c2f/thanos-sidecar/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.011237 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_136adcf8-2194-4dd2-9b57-6bf571f9e295/setup-container/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.185346 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_136adcf8-2194-4dd2-9b57-6bf571f9e295/setup-container/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.207717 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_136adcf8-2194-4dd2-9b57-6bf571f9e295/rabbitmq/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.215224 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d454968e-74c7-45e3-9608-e915973c7f25/setup-container/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.418369 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d454968e-74c7-45e3-9608-e915973c7f25/setup-container/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.447524 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d454968e-74c7-45e3-9608-e915973c7f25/rabbitmq/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.531212 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5/setup-container/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.622710 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5/setup-container/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.646380 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5/rabbitmq/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.743866 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg_6b407d88-19cd-402f-a417-64c08a37f051/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.856293 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ljjt4_d39e1c92-309d-4295-8f78-e9d01ffdb114/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:39 crc kubenswrapper[4691]: I0930 07:54:39.931794 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w_1ae5c682-dd33-42b2-8b7c-564876eef00a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:40 crc kubenswrapper[4691]: I0930 07:54:40.062373 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-lt582_30e86152-90a5-42db-a157-e86cede48629/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:40 crc kubenswrapper[4691]: I0930 07:54:40.143926 4691 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jsfrl_890b2a56-9627-4b04-9e09-5bd7625272cd/ssh-known-hosts-edpm-deployment/0.log" Sep 30 07:54:40 crc kubenswrapper[4691]: I0930 07:54:40.391526 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-688b4ff469-2cgjc_e4e270ac-98e8-47b9-bf7b-7492996aa18c/proxy-httpd/0.log" Sep 30 07:54:40 crc kubenswrapper[4691]: I0930 07:54:40.531068 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-688b4ff469-2cgjc_e4e270ac-98e8-47b9-bf7b-7492996aa18c/proxy-server/0.log" Sep 30 07:54:40 crc kubenswrapper[4691]: I0930 07:54:40.648853 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-f4xvm_22775d02-1312-4d7a-917d-80dc62539dba/swift-ring-rebalance/0.log" Sep 30 07:54:40 crc kubenswrapper[4691]: I0930 07:54:40.718122 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/account-auditor/0.log" Sep 30 07:54:40 crc kubenswrapper[4691]: I0930 07:54:40.818758 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/account-reaper/0.log" Sep 30 07:54:40 crc kubenswrapper[4691]: I0930 07:54:40.846928 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/account-replicator/0.log" Sep 30 07:54:40 crc kubenswrapper[4691]: I0930 07:54:40.937367 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/account-server/0.log" Sep 30 07:54:40 crc kubenswrapper[4691]: I0930 07:54:40.955589 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/container-auditor/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.053184 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/container-replicator/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.060249 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/container-server/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.123574 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/container-updater/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.199573 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/object-auditor/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.266456 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/object-expirer/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.282704 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/object-replicator/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.324625 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/object-server/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.417421 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/object-updater/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.464300 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/swift-recon-cron/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.486217 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/rsync/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.654224 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-cthz9_8c1bc2df-cff0-4d61-9773-0db30010956c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.689821 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03/tempest-tests-tempest-tests-runner/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.825223 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e1d484c8-a1d8-4c39-89fb-4b7679e1c22a/test-operator-logs-container/0.log" Sep 30 07:54:41 crc kubenswrapper[4691]: I0930 07:54:41.888355 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-t25dz_e97ad218-7d51-462b-bdcf-cd39157152c1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 07:54:42 crc kubenswrapper[4691]: I0930 07:54:42.708306 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_901f3032-8727-419d-8de7-b00c08535ca1/watcher-applier/0.log" Sep 30 07:54:42 crc kubenswrapper[4691]: I0930 07:54:42.965927 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_af3f1644-3ab8-4a6a-9f80-f8ea42297e98/watcher-decision-engine/1.log" Sep 30 07:54:43 crc kubenswrapper[4691]: I0930 07:54:43.193765 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_7c37a536-f38a-431d-8b76-fa23d610af0b/watcher-api-log/0.log" Sep 30 07:54:45 crc kubenswrapper[4691]: I0930 07:54:45.473059 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_af3f1644-3ab8-4a6a-9f80-f8ea42297e98/watcher-decision-engine/2.log" Sep 30 07:54:46 crc kubenswrapper[4691]: I0930 07:54:46.290022 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_7c37a536-f38a-431d-8b76-fa23d610af0b/watcher-api/0.log" Sep 30 07:54:52 crc kubenswrapper[4691]: I0930 07:54:52.850281 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:54:52 crc kubenswrapper[4691]: I0930 07:54:52.850832 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:55:22 crc kubenswrapper[4691]: I0930 07:55:22.850345 4691 patch_prober.go:28] interesting 
pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:55:22 crc kubenswrapper[4691]: I0930 07:55:22.850842 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:55:25 crc kubenswrapper[4691]: I0930 07:55:25.823953 4691 generic.go:334] "Generic (PLEG): container finished" podID="63ec617d-5809-4806-a0a5-bcc4421b2737" containerID="aaa30257679c618c88bf823386d62d3dc5ac8a9cfb64188ccb12e77069ef622e" exitCode=0 Sep 30 07:55:25 crc kubenswrapper[4691]: I0930 07:55:25.824002 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/crc-debug-dzmz8" event={"ID":"63ec617d-5809-4806-a0a5-bcc4421b2737","Type":"ContainerDied","Data":"aaa30257679c618c88bf823386d62d3dc5ac8a9cfb64188ccb12e77069ef622e"} Sep 30 07:55:26 crc kubenswrapper[4691]: I0930 07:55:26.966175 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-dzmz8" Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.009633 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-78mzf/crc-debug-dzmz8"] Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.045799 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-78mzf/crc-debug-dzmz8"] Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.160744 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63ec617d-5809-4806-a0a5-bcc4421b2737-host\") pod \"63ec617d-5809-4806-a0a5-bcc4421b2737\" (UID: \"63ec617d-5809-4806-a0a5-bcc4421b2737\") " Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.160829 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n7nqd\" (UniqueName: \"kubernetes.io/projected/63ec617d-5809-4806-a0a5-bcc4421b2737-kube-api-access-n7nqd\") pod \"63ec617d-5809-4806-a0a5-bcc4421b2737\" (UID: \"63ec617d-5809-4806-a0a5-bcc4421b2737\") " Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.160945 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/63ec617d-5809-4806-a0a5-bcc4421b2737-host" (OuterVolumeSpecName: "host") pod "63ec617d-5809-4806-a0a5-bcc4421b2737" (UID: "63ec617d-5809-4806-a0a5-bcc4421b2737"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.161649 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/63ec617d-5809-4806-a0a5-bcc4421b2737-host\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.169466 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63ec617d-5809-4806-a0a5-bcc4421b2737-kube-api-access-n7nqd" (OuterVolumeSpecName: "kube-api-access-n7nqd") pod "63ec617d-5809-4806-a0a5-bcc4421b2737" (UID: "63ec617d-5809-4806-a0a5-bcc4421b2737"). InnerVolumeSpecName "kube-api-access-n7nqd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.238373 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63ec617d-5809-4806-a0a5-bcc4421b2737" path="/var/lib/kubelet/pods/63ec617d-5809-4806-a0a5-bcc4421b2737/volumes" Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.263634 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n7nqd\" (UniqueName: \"kubernetes.io/projected/63ec617d-5809-4806-a0a5-bcc4421b2737-kube-api-access-n7nqd\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.855975 4691 scope.go:117] "RemoveContainer" containerID="aaa30257679c618c88bf823386d62d3dc5ac8a9cfb64188ccb12e77069ef622e" Sep 30 07:55:27 crc kubenswrapper[4691]: I0930 07:55:27.856132 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-dzmz8" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.184253 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-78mzf/crc-debug-6pd8b"] Sep 30 07:55:28 crc kubenswrapper[4691]: E0930 07:55:28.185203 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63ec617d-5809-4806-a0a5-bcc4421b2737" containerName="container-00" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.185218 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="63ec617d-5809-4806-a0a5-bcc4421b2737" containerName="container-00" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.185468 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="63ec617d-5809-4806-a0a5-bcc4421b2737" containerName="container-00" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.186525 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-6pd8b" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.283277 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6f270b3-7ae8-45e1-8438-da7839f4df1e-host\") pod \"crc-debug-6pd8b\" (UID: \"d6f270b3-7ae8-45e1-8438-da7839f4df1e\") " pod="openshift-must-gather-78mzf/crc-debug-6pd8b" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.283578 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94d9q\" (UniqueName: \"kubernetes.io/projected/d6f270b3-7ae8-45e1-8438-da7839f4df1e-kube-api-access-94d9q\") pod \"crc-debug-6pd8b\" (UID: \"d6f270b3-7ae8-45e1-8438-da7839f4df1e\") " pod="openshift-must-gather-78mzf/crc-debug-6pd8b" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.384968 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6f270b3-7ae8-45e1-8438-da7839f4df1e-host\") pod \"crc-debug-6pd8b\" (UID: \"d6f270b3-7ae8-45e1-8438-da7839f4df1e\") " pod="openshift-must-gather-78mzf/crc-debug-6pd8b" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.385083 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6f270b3-7ae8-45e1-8438-da7839f4df1e-host\") pod \"crc-debug-6pd8b\" (UID: \"d6f270b3-7ae8-45e1-8438-da7839f4df1e\") " pod="openshift-must-gather-78mzf/crc-debug-6pd8b" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.385422 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94d9q\" (UniqueName: \"kubernetes.io/projected/d6f270b3-7ae8-45e1-8438-da7839f4df1e-kube-api-access-94d9q\") pod \"crc-debug-6pd8b\" (UID: \"d6f270b3-7ae8-45e1-8438-da7839f4df1e\") " pod="openshift-must-gather-78mzf/crc-debug-6pd8b" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.408378 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94d9q\" (UniqueName: \"kubernetes.io/projected/d6f270b3-7ae8-45e1-8438-da7839f4df1e-kube-api-access-94d9q\") pod \"crc-debug-6pd8b\" (UID: \"d6f270b3-7ae8-45e1-8438-da7839f4df1e\") " pod="openshift-must-gather-78mzf/crc-debug-6pd8b" Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.503916 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-6pd8b" Sep 30 07:55:28 crc kubenswrapper[4691]: W0930 07:55:28.545949 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6f270b3_7ae8_45e1_8438_da7839f4df1e.slice/crio-4cfe3bdd1d42d994af085d60092f38ec75cebc209ec128575fde49517f704eea WatchSource:0}: Error finding container 4cfe3bdd1d42d994af085d60092f38ec75cebc209ec128575fde49517f704eea: Status 404 returned error can't find the container with id 4cfe3bdd1d42d994af085d60092f38ec75cebc209ec128575fde49517f704eea Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.869785 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/crc-debug-6pd8b" event={"ID":"d6f270b3-7ae8-45e1-8438-da7839f4df1e","Type":"ContainerStarted","Data":"458226a15c404f1418d17b80690fde5b54a44837964707213b53d584f857649f"} Sep 30 07:55:28 crc kubenswrapper[4691]: I0930 07:55:28.870188 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/crc-debug-6pd8b" event={"ID":"d6f270b3-7ae8-45e1-8438-da7839f4df1e","Type":"ContainerStarted","Data":"4cfe3bdd1d42d994af085d60092f38ec75cebc209ec128575fde49517f704eea"} Sep 30 07:55:29 crc kubenswrapper[4691]: I0930 07:55:29.882507 4691 generic.go:334] "Generic (PLEG): container finished" podID="d6f270b3-7ae8-45e1-8438-da7839f4df1e" containerID="458226a15c404f1418d17b80690fde5b54a44837964707213b53d584f857649f" exitCode=0 Sep 30 07:55:29 crc kubenswrapper[4691]: I0930 07:55:29.882549 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/crc-debug-6pd8b" event={"ID":"d6f270b3-7ae8-45e1-8438-da7839f4df1e","Type":"ContainerDied","Data":"458226a15c404f1418d17b80690fde5b54a44837964707213b53d584f857649f"} Sep 30 07:55:30 crc kubenswrapper[4691]: I0930 07:55:30.982351 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-6pd8b" Sep 30 07:55:31 crc kubenswrapper[4691]: I0930 07:55:31.129013 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94d9q\" (UniqueName: \"kubernetes.io/projected/d6f270b3-7ae8-45e1-8438-da7839f4df1e-kube-api-access-94d9q\") pod \"d6f270b3-7ae8-45e1-8438-da7839f4df1e\" (UID: \"d6f270b3-7ae8-45e1-8438-da7839f4df1e\") " Sep 30 07:55:31 crc kubenswrapper[4691]: I0930 07:55:31.129343 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6f270b3-7ae8-45e1-8438-da7839f4df1e-host\") pod \"d6f270b3-7ae8-45e1-8438-da7839f4df1e\" (UID: \"d6f270b3-7ae8-45e1-8438-da7839f4df1e\") " Sep 30 07:55:31 crc kubenswrapper[4691]: I0930 07:55:31.129418 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6f270b3-7ae8-45e1-8438-da7839f4df1e-host" (OuterVolumeSpecName: "host") pod "d6f270b3-7ae8-45e1-8438-da7839f4df1e" (UID: "d6f270b3-7ae8-45e1-8438-da7839f4df1e"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:55:31 crc kubenswrapper[4691]: I0930 07:55:31.129860 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6f270b3-7ae8-45e1-8438-da7839f4df1e-host\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:31 crc kubenswrapper[4691]: I0930 07:55:31.134777 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6f270b3-7ae8-45e1-8438-da7839f4df1e-kube-api-access-94d9q" (OuterVolumeSpecName: "kube-api-access-94d9q") pod "d6f270b3-7ae8-45e1-8438-da7839f4df1e" (UID: "d6f270b3-7ae8-45e1-8438-da7839f4df1e"). InnerVolumeSpecName "kube-api-access-94d9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:55:31 crc kubenswrapper[4691]: I0930 07:55:31.231395 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94d9q\" (UniqueName: \"kubernetes.io/projected/d6f270b3-7ae8-45e1-8438-da7839f4df1e-kube-api-access-94d9q\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:31 crc kubenswrapper[4691]: I0930 07:55:31.899327 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/crc-debug-6pd8b" event={"ID":"d6f270b3-7ae8-45e1-8438-da7839f4df1e","Type":"ContainerDied","Data":"4cfe3bdd1d42d994af085d60092f38ec75cebc209ec128575fde49517f704eea"} Sep 30 07:55:31 crc kubenswrapper[4691]: I0930 07:55:31.899375 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cfe3bdd1d42d994af085d60092f38ec75cebc209ec128575fde49517f704eea" Sep 30 07:55:31 crc kubenswrapper[4691]: I0930 07:55:31.899424 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-6pd8b" Sep 30 07:55:38 crc kubenswrapper[4691]: I0930 07:55:38.600283 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-78mzf/crc-debug-6pd8b"] Sep 30 07:55:38 crc kubenswrapper[4691]: I0930 07:55:38.608046 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-78mzf/crc-debug-6pd8b"] Sep 30 07:55:39 crc kubenswrapper[4691]: I0930 07:55:39.246347 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6f270b3-7ae8-45e1-8438-da7839f4df1e" path="/var/lib/kubelet/pods/d6f270b3-7ae8-45e1-8438-da7839f4df1e/volumes" Sep 30 07:55:39 crc kubenswrapper[4691]: I0930 07:55:39.831991 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-78mzf/crc-debug-mrpl2"] Sep 30 07:55:39 crc kubenswrapper[4691]: E0930 07:55:39.832713 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6f270b3-7ae8-45e1-8438-da7839f4df1e" containerName="container-00" Sep 30 07:55:39 crc kubenswrapper[4691]: I0930 07:55:39.832729 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6f270b3-7ae8-45e1-8438-da7839f4df1e" containerName="container-00" Sep 30 07:55:39 crc kubenswrapper[4691]: I0930 07:55:39.832976 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6f270b3-7ae8-45e1-8438-da7839f4df1e" containerName="container-00" Sep 30 07:55:39 crc kubenswrapper[4691]: I0930 07:55:39.833585 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-mrpl2" Sep 30 07:55:39 crc kubenswrapper[4691]: I0930 07:55:39.977060 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-host\") pod \"crc-debug-mrpl2\" (UID: \"00fbfb53-6d9d-4321-8ee6-2382ec4125fd\") " pod="openshift-must-gather-78mzf/crc-debug-mrpl2" Sep 30 07:55:39 crc kubenswrapper[4691]: I0930 07:55:39.977373 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-276mp\" (UniqueName: \"kubernetes.io/projected/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-kube-api-access-276mp\") pod \"crc-debug-mrpl2\" (UID: \"00fbfb53-6d9d-4321-8ee6-2382ec4125fd\") " pod="openshift-must-gather-78mzf/crc-debug-mrpl2" Sep 30 07:55:40 crc kubenswrapper[4691]: I0930 07:55:40.079332 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-276mp\" (UniqueName: \"kubernetes.io/projected/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-kube-api-access-276mp\") pod \"crc-debug-mrpl2\" (UID: \"00fbfb53-6d9d-4321-8ee6-2382ec4125fd\") " pod="openshift-must-gather-78mzf/crc-debug-mrpl2" Sep 30 07:55:40 crc kubenswrapper[4691]: I0930 07:55:40.079416 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-host\") pod \"crc-debug-mrpl2\" (UID: \"00fbfb53-6d9d-4321-8ee6-2382ec4125fd\") " pod="openshift-must-gather-78mzf/crc-debug-mrpl2" Sep 30 07:55:40 crc kubenswrapper[4691]: I0930 07:55:40.079587 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-host\") pod \"crc-debug-mrpl2\" (UID: \"00fbfb53-6d9d-4321-8ee6-2382ec4125fd\") " pod="openshift-must-gather-78mzf/crc-debug-mrpl2" Sep 30 07:55:40 crc kubenswrapper[4691]: I0930 07:55:40.098011 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-276mp\" (UniqueName: \"kubernetes.io/projected/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-kube-api-access-276mp\") pod \"crc-debug-mrpl2\" (UID: \"00fbfb53-6d9d-4321-8ee6-2382ec4125fd\") " pod="openshift-must-gather-78mzf/crc-debug-mrpl2" Sep 30 07:55:40 crc kubenswrapper[4691]: I0930 07:55:40.152248 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-mrpl2" Sep 30 07:55:40 crc kubenswrapper[4691]: I0930 07:55:40.986528 4691 generic.go:334] "Generic (PLEG): container finished" podID="00fbfb53-6d9d-4321-8ee6-2382ec4125fd" containerID="944ff933b4f94e985358a57857304c0dd372bb11d6acc94b6f67371790e50c5e" exitCode=0 Sep 30 07:55:40 crc kubenswrapper[4691]: I0930 07:55:40.986842 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/crc-debug-mrpl2" event={"ID":"00fbfb53-6d9d-4321-8ee6-2382ec4125fd","Type":"ContainerDied","Data":"944ff933b4f94e985358a57857304c0dd372bb11d6acc94b6f67371790e50c5e"} Sep 30 07:55:40 crc kubenswrapper[4691]: I0930 07:55:40.986874 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/crc-debug-mrpl2" event={"ID":"00fbfb53-6d9d-4321-8ee6-2382ec4125fd","Type":"ContainerStarted","Data":"2e53004468ec252e6ee5bcbc0abe0ceb91ece6188ba6ab2b75093046b4f8636a"} Sep 30 07:55:41 crc kubenswrapper[4691]: I0930 07:55:41.032723 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-78mzf/crc-debug-mrpl2"] Sep 30 07:55:41 crc kubenswrapper[4691]: I0930 07:55:41.043836 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-78mzf/crc-debug-mrpl2"] Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.119033 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-mrpl2" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.220356 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-276mp\" (UniqueName: \"kubernetes.io/projected/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-kube-api-access-276mp\") pod \"00fbfb53-6d9d-4321-8ee6-2382ec4125fd\" (UID: \"00fbfb53-6d9d-4321-8ee6-2382ec4125fd\") " Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.220419 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-host\") pod \"00fbfb53-6d9d-4321-8ee6-2382ec4125fd\" (UID: \"00fbfb53-6d9d-4321-8ee6-2382ec4125fd\") " Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.220568 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-host" (OuterVolumeSpecName: "host") pod "00fbfb53-6d9d-4321-8ee6-2382ec4125fd" (UID: "00fbfb53-6d9d-4321-8ee6-2382ec4125fd"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.220825 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-host\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.234466 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-kube-api-access-276mp" (OuterVolumeSpecName: "kube-api-access-276mp") pod "00fbfb53-6d9d-4321-8ee6-2382ec4125fd" (UID: "00fbfb53-6d9d-4321-8ee6-2382ec4125fd"). InnerVolumeSpecName "kube-api-access-276mp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.322341 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-276mp\" (UniqueName: \"kubernetes.io/projected/00fbfb53-6d9d-4321-8ee6-2382ec4125fd-kube-api-access-276mp\") on node \"crc\" DevicePath \"\"" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.581901 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/util/0.log" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.760183 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/util/0.log" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.778566 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/pull/0.log" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.784776 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/pull/0.log" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.957699 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/util/0.log" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.984500 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/pull/0.log" Sep 30 07:55:42 crc kubenswrapper[4691]: I0930 07:55:42.993192 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/extract/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.006453 4691 scope.go:117] "RemoveContainer" containerID="944ff933b4f94e985358a57857304c0dd372bb11d6acc94b6f67371790e50c5e" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.006696 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-78mzf/crc-debug-mrpl2" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.140146 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-qqx55_a5779e0d-8902-4a45-b28e-4253af3938ae/kube-rbac-proxy/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.201256 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-qqx55_a5779e0d-8902-4a45-b28e-4253af3938ae/manager/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.222573 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-9dj86_d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2/kube-rbac-proxy/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.236702 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00fbfb53-6d9d-4321-8ee6-2382ec4125fd" path="/var/lib/kubelet/pods/00fbfb53-6d9d-4321-8ee6-2382ec4125fd/volumes" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.386832 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-9dj86_d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2/manager/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.421556 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-8wf6n_2a1af285-1505-419c-bacc-16d8a161aca2/kube-rbac-proxy/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.430128 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-8wf6n_2a1af285-1505-419c-bacc-16d8a161aca2/manager/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.610418 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-qwmv9_60dcfaf5-c692-44e5-8868-1dfccb14f535/kube-rbac-proxy/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.693309 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-qwmv9_60dcfaf5-c692-44e5-8868-1dfccb14f535/manager/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.757457 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-5j6fw_5c0ba848-ac6e-4515-99e3-e1665ff79d7c/kube-rbac-proxy/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.810252 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-5j6fw_5c0ba848-ac6e-4515-99e3-e1665ff79d7c/manager/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.864772 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-rxdj9_c9f2f281-c656-4c29-bf86-c38f9cd79528/kube-rbac-proxy/0.log" Sep 30 07:55:43 crc kubenswrapper[4691]: I0930 07:55:43.972975 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-rxdj9_c9f2f281-c656-4c29-bf86-c38f9cd79528/manager/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.002898 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-jgh2d_1d4f2da8-966a-4a80-aca6-efdd8faca337/kube-rbac-proxy/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.230516 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-jgh2d_1d4f2da8-966a-4a80-aca6-efdd8faca337/manager/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.240640 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-x79vm_9c5c1b63-6185-424c-a584-35a18e2c69bd/manager/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.246585 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-x79vm_9c5c1b63-6185-424c-a584-35a18e2c69bd/kube-rbac-proxy/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.384197 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-9xs5h_d6fb63f5-e7b6-47fd-ac44-b59058899b3c/kube-rbac-proxy/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.477095 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-9xs5h_d6fb63f5-e7b6-47fd-ac44-b59058899b3c/manager/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.590156 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-rtqwx_11bd74e6-05a4-44fc-b360-f1d71352011e/kube-rbac-proxy/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.590299 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-rtqwx_11bd74e6-05a4-44fc-b360-f1d71352011e/manager/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.653209 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-c5nbk_a1aaa7fa-8695-4124-ad5a-26f11a99b1c8/kube-rbac-proxy/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.790220 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-c5nbk_a1aaa7fa-8695-4124-ad5a-26f11a99b1c8/manager/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.818444 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-2kzwg_2c674607-65d9-4be2-9244-d61eadb97dd7/kube-rbac-proxy/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.886775 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-2kzwg_2c674607-65d9-4be2-9244-d61eadb97dd7/manager/0.log" Sep 30 07:55:44 crc kubenswrapper[4691]: I0930 07:55:44.965996 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-gtznz_a8dd4aa3-ab8b-4f66-9722-8873600c87eb/kube-rbac-proxy/0.log" Sep 30 07:55:45 crc kubenswrapper[4691]: I0930 07:55:45.081376 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-gtznz_a8dd4aa3-ab8b-4f66-9722-8873600c87eb/manager/0.log" Sep 30 07:55:45 crc kubenswrapper[4691]: I0930 07:55:45.157342 4691 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-h42cw_88c75f60-538f-4059-aaeb-b41dcdcf7cfa/kube-rbac-proxy/0.log" Sep 30 07:55:45 crc kubenswrapper[4691]: I0930 07:55:45.214194 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-h42cw_88c75f60-538f-4059-aaeb-b41dcdcf7cfa/manager/0.log" Sep 30 07:55:45 crc kubenswrapper[4691]: I0930 07:55:45.285372 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-r8s99_4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d/kube-rbac-proxy/0.log" Sep 30 07:55:45 crc kubenswrapper[4691]: I0930 07:55:45.356649 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-r8s99_4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d/manager/0.log" Sep 30 07:55:45 crc kubenswrapper[4691]: I0930 07:55:45.448999 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-86d6bdfc6d-7zkhq_1013022f-3fa2-44d5-a111-5f89a6a7bb17/kube-rbac-proxy/0.log" Sep 30 07:55:45 crc kubenswrapper[4691]: I0930 07:55:45.666016 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6bb46fb86b-vx6kv_e5d557a0-1dee-462b-89d5-86c8479ef2e4/kube-rbac-proxy/0.log" Sep 30 07:55:45 crc kubenswrapper[4691]: I0930 07:55:45.777454 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6bb46fb86b-vx6kv_e5d557a0-1dee-462b-89d5-86c8479ef2e4/operator/0.log" Sep 30 07:55:45 crc kubenswrapper[4691]: I0930 07:55:45.933445 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-swhwg_664688ee-c3cc-4f92-86b7-64d53b8c133d/registry-server/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.021288 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-bzx6l_a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af/kube-rbac-proxy/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.204049 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-bzx6l_a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af/manager/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.296320 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-vd4rw_52aa93bd-f5d7-479e-a8fe-2c6e70a70fae/kube-rbac-proxy/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.341555 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-vd4rw_52aa93bd-f5d7-479e-a8fe-2c6e70a70fae/manager/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.481686 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-7v82c_54fbcf55-e81d-4336-8e38-9bb1d3ec3c47/operator/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.577494 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-s4rpv_3bce910d-be3e-4332-89df-75e715d95988/kube-rbac-proxy/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.648983 4691 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-86d6bdfc6d-7zkhq_1013022f-3fa2-44d5-a111-5f89a6a7bb17/manager/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.697816 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-44qv5_88da73a4-c9e2-4a78-b313-8cf689562e38/kube-rbac-proxy/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.711305 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-s4rpv_3bce910d-be3e-4332-89df-75e715d95988/manager/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.936278 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-8qj9q_a1cbd98a-2f66-4649-8347-938d07f93eb1/manager/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.936621 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-8qj9q_a1cbd98a-2f66-4649-8347-938d07f93eb1/kube-rbac-proxy/0.log" Sep 30 07:55:46 crc kubenswrapper[4691]: I0930 07:55:46.992123 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-44qv5_88da73a4-c9e2-4a78-b313-8cf689562e38/manager/0.log" Sep 30 07:55:47 crc kubenswrapper[4691]: I0930 07:55:47.026773 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bd494bc6d-x495w_9678f82b-58e6-4529-bdf6-6faaf2d7bcfa/kube-rbac-proxy/0.log" Sep 30 07:55:47 crc kubenswrapper[4691]: I0930 07:55:47.191503 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bd494bc6d-x495w_9678f82b-58e6-4529-bdf6-6faaf2d7bcfa/manager/0.log" Sep 30 07:55:52 crc kubenswrapper[4691]: I0930 07:55:52.849874 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 07:55:52 crc kubenswrapper[4691]: I0930 07:55:52.850862 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Sep 30 07:55:52 crc kubenswrapper[4691]: I0930 07:55:52.850968 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" Sep 30 07:55:52 crc kubenswrapper[4691]: I0930 07:55:52.852116 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"37d19b6a60363a5a6aed5989b48791cca474e4f931fda010298ee4b265d6d360"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Sep 30 07:55:52 crc kubenswrapper[4691]: I0930 07:55:52.852187 4691 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://37d19b6a60363a5a6aed5989b48791cca474e4f931fda010298ee4b265d6d360" gracePeriod=600 Sep 30 07:55:53 crc kubenswrapper[4691]: I0930 07:55:53.097697 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="37d19b6a60363a5a6aed5989b48791cca474e4f931fda010298ee4b265d6d360" exitCode=0 Sep 30 07:55:53 crc kubenswrapper[4691]: I0930 07:55:53.097772 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"37d19b6a60363a5a6aed5989b48791cca474e4f931fda010298ee4b265d6d360"} Sep 30 07:55:53 crc kubenswrapper[4691]: I0930 07:55:53.098050 4691 scope.go:117] "RemoveContainer" containerID="d2c1910332c7b39c28fb9c23f54694ce1e0dbe190d7af26b09b1c40104f1b3da" Sep 30 07:55:54 crc kubenswrapper[4691]: I0930 07:55:54.106916 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a"} Sep 30 07:56:02 crc kubenswrapper[4691]: I0930 07:56:02.932564 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kr8lp_2997210f-48b1-46e1-bf0f-12ed24852c8b/control-plane-machine-set-operator/0.log" Sep 30 07:56:03 crc kubenswrapper[4691]: I0930 07:56:03.133358 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kzftt_ee1c2dd6-d759-4d3c-9ec7-86ec11419202/kube-rbac-proxy/0.log" Sep 30 07:56:03 crc kubenswrapper[4691]: I0930 07:56:03.223359 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kzftt_ee1c2dd6-d759-4d3c-9ec7-86ec11419202/machine-api-operator/0.log" Sep 30 07:56:14 crc kubenswrapper[4691]: I0930 07:56:14.731081 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-xxwh7_2ee3b0f5-be03-426e-b603-ec6c53237e85/cert-manager-controller/0.log" Sep 30 07:56:14 crc kubenswrapper[4691]: I0930 07:56:14.892251 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-xwxbt_5746a924-b059-4e93-91c3-31bbe5e2ef86/cert-manager-cainjector/0.log" Sep 30 07:56:14 crc kubenswrapper[4691]: I0930 07:56:14.986733 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-99v8p_746ab0d5-3b5c-4985-935e-73a35939302d/cert-manager-webhook/0.log" Sep 30 07:56:26 crc kubenswrapper[4691]: I0930 07:56:26.591445 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-r4qfl_e1989084-5e13-4ce8-9d59-050337ff70da/nmstate-console-plugin/0.log" Sep 30 07:56:26 crc kubenswrapper[4691]: I0930 07:56:26.774028 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-gcxcd_3df27da9-f98e-41a7-84fb-bfad238e7533/nmstate-handler/0.log" Sep 30 07:56:26 crc kubenswrapper[4691]: I0930 07:56:26.832443 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tqjqg_55ec66af-837d-40c5-81d2-6b311f0dc05c/kube-rbac-proxy/0.log" Sep 30 07:56:26 crc kubenswrapper[4691]: I0930 07:56:26.843082 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tqjqg_55ec66af-837d-40c5-81d2-6b311f0dc05c/nmstate-metrics/0.log" Sep 30 07:56:26 crc kubenswrapper[4691]: I0930 07:56:26.965282 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-wnr29_8dee2c6d-f8b8-4b1a-ae65-af2728adad3e/nmstate-operator/0.log" Sep 30 07:56:27 crc kubenswrapper[4691]: I0930 07:56:27.020142 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-pz8zd_3cdd6ae9-7044-4fb4-92fb-0a503651b60d/nmstate-webhook/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.162790 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-52gd8_238ed092-dc40-4e7d-add0-854dd611a65f/kube-rbac-proxy/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.315779 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-52gd8_238ed092-dc40-4e7d-add0-854dd611a65f/controller/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.354825 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-frr-files/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.520515 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-frr-files/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.539562 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-metrics/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.556542 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-reloader/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.583192 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-reloader/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.737797 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-metrics/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.749608 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-frr-files/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.767303 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-reloader/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.774771 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-metrics/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.956313 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-frr-files/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.981182 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-metrics/0.log" Sep 30 07:56:40 crc kubenswrapper[4691]: I0930 07:56:40.996673 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-reloader/0.log" Sep 30 07:56:41 crc kubenswrapper[4691]: I0930 07:56:41.009558 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/controller/0.log" Sep 30 07:56:41 crc kubenswrapper[4691]: I0930 07:56:41.173835 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/frr-metrics/0.log" Sep 30 07:56:41 crc kubenswrapper[4691]: I0930 07:56:41.182031 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/kube-rbac-proxy/0.log" Sep 30 07:56:41 crc kubenswrapper[4691]: I0930 07:56:41.225096 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/kube-rbac-proxy-frr/0.log" Sep 30 07:56:41 crc kubenswrapper[4691]: I0930 07:56:41.406827 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/reloader/0.log" Sep 30 07:56:41 crc kubenswrapper[4691]: I0930 07:56:41.466066 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-4szpr_45188dc2-6524-4d87-bfaa-676d46684df8/frr-k8s-webhook-server/0.log" Sep 30 07:56:41 crc kubenswrapper[4691]: I0930 07:56:41.639993 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b7f74d8d8-lfcgt_fc264033-2e29-41cc-b961-92dbd3230d34/manager/0.log" Sep 30 07:56:41 crc kubenswrapper[4691]: I0930 07:56:41.884659 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-b8f956b88-zp5fz_164027cf-f7af-41cc-bbd2-e3a725230c9e/webhook-server/0.log" Sep 30 07:56:41 crc kubenswrapper[4691]: I0930 07:56:41.931375 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vqw66_cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6/kube-rbac-proxy/0.log" Sep 30 07:56:42 crc kubenswrapper[4691]: I0930 07:56:42.595859 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vqw66_cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6/speaker/0.log" Sep 30 07:56:42 crc kubenswrapper[4691]: I0930 07:56:42.878383 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/frr/0.log" Sep 30 07:56:54 crc kubenswrapper[4691]: I0930 07:56:54.674244 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/util/0.log" Sep 30 07:56:54 crc kubenswrapper[4691]: I0930 07:56:54.822823 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/pull/0.log" Sep 30 07:56:54 crc kubenswrapper[4691]: I0930 07:56:54.852393 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/util/0.log" Sep 30 
07:56:54 crc kubenswrapper[4691]: I0930 07:56:54.853202 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/pull/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.004391 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/util/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.005122 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/pull/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.038687 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/extract/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.185932 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/util/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.377130 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/pull/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.377755 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/util/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.394954 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/pull/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.523409 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/pull/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.533298 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/util/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.581072 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/extract/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.687423 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-utilities/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.826461 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-utilities/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.841049 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-content/0.log" Sep 30 07:56:55 crc kubenswrapper[4691]: I0930 07:56:55.868953 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-content/0.log" Sep 30 07:56:56 crc kubenswrapper[4691]: I0930 07:56:56.024163 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-utilities/0.log" Sep 30 07:56:56 crc kubenswrapper[4691]: I0930 07:56:56.031755 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-content/0.log" Sep 30 07:56:56 crc kubenswrapper[4691]: I0930 07:56:56.235187 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-utilities/0.log" Sep 30 07:56:56 crc kubenswrapper[4691]: I0930 07:56:56.317134 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/registry-server/0.log" Sep 30 07:56:56 crc kubenswrapper[4691]: I0930 07:56:56.409135 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-utilities/0.log" Sep 30 07:56:56 crc kubenswrapper[4691]: I0930 07:56:56.414581 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-content/0.log" Sep 30 07:56:56 crc kubenswrapper[4691]: I0930 07:56:56.423794 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-content/0.log" Sep 30 07:56:56 crc kubenswrapper[4691]: I0930 07:56:56.606434 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-utilities/0.log" Sep 30 07:56:56 crc kubenswrapper[4691]: I0930 07:56:56.619809 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-content/0.log" Sep 30 07:56:56 crc kubenswrapper[4691]: I0930 07:56:56.788155 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/util/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.029819 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/pull/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.040563 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/pull/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.045516 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/util/0.log" Sep 30 07:56:57 crc 
kubenswrapper[4691]: I0930 07:56:57.239082 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/util/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.262505 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/pull/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.297810 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/extract/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.518689 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4km9n_f46c875b-2f18-4fac-98af-64b0756b7e26/marketplace-operator/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.572061 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/registry-server/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.682267 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-utilities/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.827698 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-utilities/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.833366 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-content/0.log" Sep 30 07:56:57 crc kubenswrapper[4691]: I0930 07:56:57.872933 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-content/0.log" Sep 30 07:56:58 crc kubenswrapper[4691]: I0930 07:56:58.017609 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-utilities/0.log" Sep 30 07:56:58 crc kubenswrapper[4691]: I0930 07:56:58.084953 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-content/0.log" Sep 30 07:56:58 crc kubenswrapper[4691]: I0930 07:56:58.091788 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xbbxj_6366c530-63f4-4f81-b0eb-b4db91578068/extract-utilities/0.log" Sep 30 07:56:58 crc kubenswrapper[4691]: I0930 07:56:58.279231 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xbbxj_6366c530-63f4-4f81-b0eb-b4db91578068/extract-utilities/0.log" Sep 30 07:56:58 crc kubenswrapper[4691]: I0930 07:56:58.318538 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/registry-server/0.log" Sep 30 07:56:58 crc kubenswrapper[4691]: I0930 07:56:58.351144 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-xbbxj_6366c530-63f4-4f81-b0eb-b4db91578068/extract-content/0.log" Sep 30 07:56:58 crc kubenswrapper[4691]: I0930 07:56:58.369001 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xbbxj_6366c530-63f4-4f81-b0eb-b4db91578068/extract-content/0.log" Sep 30 07:56:58 crc kubenswrapper[4691]: I0930 07:56:58.543823 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xbbxj_6366c530-63f4-4f81-b0eb-b4db91578068/extract-content/0.log" Sep 30 07:56:58 crc kubenswrapper[4691]: I0930 07:56:58.546565 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xbbxj_6366c530-63f4-4f81-b0eb-b4db91578068/extract-utilities/0.log" Sep 30 07:56:59 crc kubenswrapper[4691]: I0930 07:56:59.274739 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-xbbxj_6366c530-63f4-4f81-b0eb-b4db91578068/registry-server/0.log" Sep 30 07:57:10 crc kubenswrapper[4691]: I0930 07:57:10.508448 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-xqs9h_276e7e45-2756-4551-867f-2184113b0749/prometheus-operator/0.log" Sep 30 07:57:10 crc kubenswrapper[4691]: I0930 07:57:10.655156 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt_5a44c0de-cf12-49e9-9f72-eb618b14445b/prometheus-operator-admission-webhook/0.log" Sep 30 07:57:10 crc kubenswrapper[4691]: I0930 07:57:10.689242 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67d746f8c7-h259t_b9ef8251-85be-4df7-9372-65a9fa9db6f7/prometheus-operator-admission-webhook/0.log" Sep 30 07:57:10 crc kubenswrapper[4691]: I0930 07:57:10.868947 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-fmpwf_e6e2f68b-8f48-4a4d-a96d-400c32cb80c9/perses-operator/0.log" Sep 30 07:57:10 crc kubenswrapper[4691]: I0930 07:57:10.892794 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-jw7dt_426f2d02-4b9e-432d-a888-c799b2db417a/operator/0.log" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.081164 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zjndd"] Sep 30 07:57:27 crc kubenswrapper[4691]: E0930 07:57:27.082223 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00fbfb53-6d9d-4321-8ee6-2382ec4125fd" containerName="container-00" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.082241 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="00fbfb53-6d9d-4321-8ee6-2382ec4125fd" containerName="container-00" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.082527 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="00fbfb53-6d9d-4321-8ee6-2382ec4125fd" containerName="container-00" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.084553 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.092263 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjndd"] Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.126274 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czz97\" (UniqueName: \"kubernetes.io/projected/fcb4d569-d722-4e61-a4eb-e05e8bfe3927-kube-api-access-czz97\") pod \"redhat-operators-zjndd\" (UID: \"fcb4d569-d722-4e61-a4eb-e05e8bfe3927\") " pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.126324 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb4d569-d722-4e61-a4eb-e05e8bfe3927-catalog-content\") pod \"redhat-operators-zjndd\" (UID: \"fcb4d569-d722-4e61-a4eb-e05e8bfe3927\") " pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.126367 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb4d569-d722-4e61-a4eb-e05e8bfe3927-utilities\") pod \"redhat-operators-zjndd\" (UID: \"fcb4d569-d722-4e61-a4eb-e05e8bfe3927\") " pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.228146 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czz97\" (UniqueName: \"kubernetes.io/projected/fcb4d569-d722-4e61-a4eb-e05e8bfe3927-kube-api-access-czz97\") pod \"redhat-operators-zjndd\" (UID: \"fcb4d569-d722-4e61-a4eb-e05e8bfe3927\") " pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.228192 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb4d569-d722-4e61-a4eb-e05e8bfe3927-catalog-content\") pod \"redhat-operators-zjndd\" (UID: \"fcb4d569-d722-4e61-a4eb-e05e8bfe3927\") " pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.228220 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb4d569-d722-4e61-a4eb-e05e8bfe3927-utilities\") pod \"redhat-operators-zjndd\" (UID: \"fcb4d569-d722-4e61-a4eb-e05e8bfe3927\") " pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.228727 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb4d569-d722-4e61-a4eb-e05e8bfe3927-catalog-content\") pod \"redhat-operators-zjndd\" (UID: \"fcb4d569-d722-4e61-a4eb-e05e8bfe3927\") " pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.228757 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb4d569-d722-4e61-a4eb-e05e8bfe3927-utilities\") pod \"redhat-operators-zjndd\" (UID: \"fcb4d569-d722-4e61-a4eb-e05e8bfe3927\") " pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.251599 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-czz97\" (UniqueName: \"kubernetes.io/projected/fcb4d569-d722-4e61-a4eb-e05e8bfe3927-kube-api-access-czz97\") pod \"redhat-operators-zjndd\" (UID: \"fcb4d569-d722-4e61-a4eb-e05e8bfe3927\") " pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.403101 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-zjndd" Sep 30 07:57:27 crc kubenswrapper[4691]: I0930 07:57:27.935992 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjndd"] Sep 30 07:57:28 crc kubenswrapper[4691]: I0930 07:57:28.027629 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjndd" event={"ID":"fcb4d569-d722-4e61-a4eb-e05e8bfe3927","Type":"ContainerStarted","Data":"25c03dcd943233b12434f460114bda82e6cb17a37ac0f9042faf8f1a868d21f9"} Sep 30 07:57:29 crc kubenswrapper[4691]: I0930 07:57:29.044999 4691 generic.go:334] "Generic (PLEG): container finished" podID="fcb4d569-d722-4e61-a4eb-e05e8bfe3927" containerID="2a43c984fe76f8be7a75c430b1b1845b52cea38d0529a13534a520c789218e04" exitCode=0 Sep 30 07:57:29 crc kubenswrapper[4691]: I0930 07:57:29.045049 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjndd" event={"ID":"fcb4d569-d722-4e61-a4eb-e05e8bfe3927","Type":"ContainerDied","Data":"2a43c984fe76f8be7a75c430b1b1845b52cea38d0529a13534a520c789218e04"} Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.092331 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-9gqqq"] Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.095554 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.107865 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gqqq"] Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.198012 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-catalog-content\") pod \"community-operators-9gqqq\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") " pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.198272 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hplrw\" (UniqueName: \"kubernetes.io/projected/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-kube-api-access-hplrw\") pod \"community-operators-9gqqq\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") " pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.198301 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-utilities\") pod \"community-operators-9gqqq\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") " pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.300781 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-catalog-content\") pod \"community-operators-9gqqq\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") " pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.300977 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hplrw\" (UniqueName: \"kubernetes.io/projected/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-kube-api-access-hplrw\") pod \"community-operators-9gqqq\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") " pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.301005 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-utilities\") pod \"community-operators-9gqqq\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") " pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.301579 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-catalog-content\") pod \"community-operators-9gqqq\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") " pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.301841 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-utilities\") pod \"community-operators-9gqqq\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") " pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.322216 4691 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hplrw\" (UniqueName: \"kubernetes.io/projected/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-kube-api-access-hplrw\") pod \"community-operators-9gqqq\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") " pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.417291 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gqqq" Sep 30 07:57:30 crc kubenswrapper[4691]: I0930 07:57:30.955259 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-9gqqq"] Sep 30 07:57:30 crc kubenswrapper[4691]: W0930 07:57:30.961837 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca1e0f85_0060_40b4_9ff8_fdb2573fdaab.slice/crio-9adddd86450930895deb814b553d649f2f5994fe8225fda440f93848538a64a4 WatchSource:0}: Error finding container 9adddd86450930895deb814b553d649f2f5994fe8225fda440f93848538a64a4: Status 404 returned error can't find the container with id 9adddd86450930895deb814b553d649f2f5994fe8225fda440f93848538a64a4 Sep 30 07:57:31 crc kubenswrapper[4691]: I0930 07:57:31.073011 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gqqq" event={"ID":"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab","Type":"ContainerStarted","Data":"9adddd86450930895deb814b553d649f2f5994fe8225fda440f93848538a64a4"} Sep 30 07:57:32 crc kubenswrapper[4691]: I0930 07:57:32.084649 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerID="769eb708f5c4c3baf99f9ed40a6df72d89c327e0275ef32f513fd816f0adaf38" exitCode=0 Sep 30 07:57:32 crc kubenswrapper[4691]: I0930 07:57:32.084737 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gqqq" event={"ID":"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab","Type":"ContainerDied","Data":"769eb708f5c4c3baf99f9ed40a6df72d89c327e0275ef32f513fd816f0adaf38"} Sep 30 07:57:34 crc kubenswrapper[4691]: I0930 07:57:34.108322 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gqqq" event={"ID":"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab","Type":"ContainerStarted","Data":"82475612db2b436ddb4675ee45c6616d96431ee4be999d2d1d4ba3134c947fca"} Sep 30 07:57:35 crc kubenswrapper[4691]: I0930 07:57:35.124649 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerID="82475612db2b436ddb4675ee45c6616d96431ee4be999d2d1d4ba3134c947fca" exitCode=0 Sep 30 07:57:35 crc kubenswrapper[4691]: I0930 07:57:35.124759 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gqqq" event={"ID":"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab","Type":"ContainerDied","Data":"82475612db2b436ddb4675ee45c6616d96431ee4be999d2d1d4ba3134c947fca"} Sep 30 07:57:38 crc kubenswrapper[4691]: I0930 07:57:38.489845 4691 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Sep 30 07:57:39 crc kubenswrapper[4691]: I0930 07:57:39.182835 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjndd" event={"ID":"fcb4d569-d722-4e61-a4eb-e05e8bfe3927","Type":"ContainerStarted","Data":"23db1749d5c6be8c7d64f376fe9f214afa621c6c0aafde62f9af3d532495a113"} Sep 30 07:57:40 crc kubenswrapper[4691]: I0930 
07:57:40.196784 4691 generic.go:334] "Generic (PLEG): container finished" podID="fcb4d569-d722-4e61-a4eb-e05e8bfe3927" containerID="23db1749d5c6be8c7d64f376fe9f214afa621c6c0aafde62f9af3d532495a113" exitCode=0
Sep 30 07:57:40 crc kubenswrapper[4691]: I0930 07:57:40.196860 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjndd" event={"ID":"fcb4d569-d722-4e61-a4eb-e05e8bfe3927","Type":"ContainerDied","Data":"23db1749d5c6be8c7d64f376fe9f214afa621c6c0aafde62f9af3d532495a113"}
Sep 30 07:57:40 crc kubenswrapper[4691]: I0930 07:57:40.202636 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gqqq" event={"ID":"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab","Type":"ContainerStarted","Data":"5a99d33905d64610f40669d0a4ad390e3f5974ad29f9ef2bbb996ef1295db38f"}
Sep 30 07:57:40 crc kubenswrapper[4691]: I0930 07:57:40.250468 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-9gqqq" podStartSLOduration=3.33835497 podStartE2EDuration="10.250450334s" podCreationTimestamp="2025-09-30 07:57:30 +0000 UTC" firstStartedPulling="2025-09-30 07:57:32.086848014 +0000 UTC m=+5895.561869064" lastFinishedPulling="2025-09-30 07:57:38.998943378 +0000 UTC m=+5902.473964428" observedRunningTime="2025-09-30 07:57:40.244107242 +0000 UTC m=+5903.719128342" watchObservedRunningTime="2025-09-30 07:57:40.250450334 +0000 UTC m=+5903.725471374"
Sep 30 07:57:40 crc kubenswrapper[4691]: I0930 07:57:40.418543 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-9gqqq"
Sep 30 07:57:40 crc kubenswrapper[4691]: I0930 07:57:40.418587 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-9gqqq"
Sep 30 07:57:41 crc kubenswrapper[4691]: I0930 07:57:41.473327 4691 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-9gqqq" podUID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerName="registry-server" probeResult="failure" output=<
Sep 30 07:57:41 crc kubenswrapper[4691]: timeout: failed to connect service ":50051" within 1s
Sep 30 07:57:41 crc kubenswrapper[4691]: >
Sep 30 07:57:42 crc kubenswrapper[4691]: I0930 07:57:42.226990 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zjndd" event={"ID":"fcb4d569-d722-4e61-a4eb-e05e8bfe3927","Type":"ContainerStarted","Data":"a184f4d91794479c59680422923cee1802a8af9f9a69ba02c2d34eaf494c48d1"}
Sep 30 07:57:42 crc kubenswrapper[4691]: I0930 07:57:42.251444 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zjndd" podStartSLOduration=2.905095563 podStartE2EDuration="15.25142048s" podCreationTimestamp="2025-09-30 07:57:27 +0000 UTC" firstStartedPulling="2025-09-30 07:57:29.047250796 +0000 UTC m=+5892.522271836" lastFinishedPulling="2025-09-30 07:57:41.393575713 +0000 UTC m=+5904.868596753" observedRunningTime="2025-09-30 07:57:42.248074923 +0000 UTC m=+5905.723096023" watchObservedRunningTime="2025-09-30 07:57:42.25142048 +0000 UTC m=+5905.726441550"
Sep 30 07:57:47 crc kubenswrapper[4691]: I0930 07:57:47.403261 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zjndd"
Sep 30 07:57:47 crc kubenswrapper[4691]: I0930 07:57:47.403815 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zjndd"
Sep 30 07:57:47 crc kubenswrapper[4691]: I0930 07:57:47.472123 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zjndd"
Sep 30 07:57:48 crc kubenswrapper[4691]: I0930 07:57:48.381593 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zjndd"
Sep 30 07:57:48 crc kubenswrapper[4691]: I0930 07:57:48.496849 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zjndd"]
Sep 30 07:57:48 crc kubenswrapper[4691]: I0930 07:57:48.521406 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbbxj"]
Sep 30 07:57:48 crc kubenswrapper[4691]: I0930 07:57:48.521699 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xbbxj" podUID="6366c530-63f4-4f81-b0eb-b4db91578068" containerName="registry-server" containerID="cri-o://10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a" gracePeriod=2
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.038559 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbbxj"
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.136285 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-catalog-content\") pod \"6366c530-63f4-4f81-b0eb-b4db91578068\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") "
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.136451 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vj6zp\" (UniqueName: \"kubernetes.io/projected/6366c530-63f4-4f81-b0eb-b4db91578068-kube-api-access-vj6zp\") pod \"6366c530-63f4-4f81-b0eb-b4db91578068\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") "
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.136483 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-utilities\") pod \"6366c530-63f4-4f81-b0eb-b4db91578068\" (UID: \"6366c530-63f4-4f81-b0eb-b4db91578068\") "
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.137383 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-utilities" (OuterVolumeSpecName: "utilities") pod "6366c530-63f4-4f81-b0eb-b4db91578068" (UID: "6366c530-63f4-4f81-b0eb-b4db91578068"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.144196 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6366c530-63f4-4f81-b0eb-b4db91578068-kube-api-access-vj6zp" (OuterVolumeSpecName: "kube-api-access-vj6zp") pod "6366c530-63f4-4f81-b0eb-b4db91578068" (UID: "6366c530-63f4-4f81-b0eb-b4db91578068"). InnerVolumeSpecName "kube-api-access-vj6zp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.238571 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vj6zp\" (UniqueName: \"kubernetes.io/projected/6366c530-63f4-4f81-b0eb-b4db91578068-kube-api-access-vj6zp\") on node \"crc\" DevicePath \"\""
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.238792 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.242293 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6366c530-63f4-4f81-b0eb-b4db91578068" (UID: "6366c530-63f4-4f81-b0eb-b4db91578068"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.322961 4691 generic.go:334] "Generic (PLEG): container finished" podID="6366c530-63f4-4f81-b0eb-b4db91578068" containerID="10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a" exitCode=0
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.323257 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xbbxj"
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.323267 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbbxj" event={"ID":"6366c530-63f4-4f81-b0eb-b4db91578068","Type":"ContainerDied","Data":"10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a"}
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.323453 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xbbxj" event={"ID":"6366c530-63f4-4f81-b0eb-b4db91578068","Type":"ContainerDied","Data":"c3f110b5b89a9a1b6bc7b0c4dda4da315ca4be51d6a0fdf75770c6a346d39665"}
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.323474 4691 scope.go:117] "RemoveContainer" containerID="10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a"
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.340629 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6366c530-63f4-4f81-b0eb-b4db91578068-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.343911 4691 scope.go:117] "RemoveContainer" containerID="7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111"
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.373106 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xbbxj"]
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.377778 4691 scope.go:117] "RemoveContainer" containerID="a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018"
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.386502 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xbbxj"]
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.416077 4691 scope.go:117] "RemoveContainer" containerID="10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a"
Sep 30 07:57:49 crc kubenswrapper[4691]: E0930 07:57:49.416664 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a\": container with ID starting with 10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a not found: ID does not exist" containerID="10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a"
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.416695 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a"} err="failed to get container status \"10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a\": rpc error: code = NotFound desc = could not find container \"10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a\": container with ID starting with 10a65d458860ed3b8bb6fb11740202bb4aa58bedccae5d865e82f44abe165d0a not found: ID does not exist"
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.416716 4691 scope.go:117] "RemoveContainer" containerID="7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111"
Sep 30 07:57:49 crc kubenswrapper[4691]: E0930 07:57:49.418320 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111\": container with ID starting with 7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111 not found: ID does not exist" containerID="7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111"
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.418370 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111"} err="failed to get container status \"7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111\": rpc error: code = NotFound desc = could not find container \"7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111\": container with ID starting with 7f9b1227e638e5bfb7b05323fc4627f51135b1af60fa694ad847c30811193111 not found: ID does not exist"
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.418406 4691 scope.go:117] "RemoveContainer" containerID="a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018"
Sep 30 07:57:49 crc kubenswrapper[4691]: E0930 07:57:49.418847 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018\": container with ID starting with a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018 not found: ID does not exist" containerID="a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018"
Sep 30 07:57:49 crc kubenswrapper[4691]: I0930 07:57:49.419041 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018"} err="failed to get container status \"a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018\": rpc error: code = NotFound desc = could not find container \"a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018\": container with ID starting with a8430e7e4e6f2dd4173c74a3ec78264a2aa6b79d397b0461968df6b9aa8a4018 not found: ID does not exist"
Sep 30 07:57:49 crc kubenswrapper[4691]: E0930 07:57:49.589623 4691 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6366c530_63f4_4f81_b0eb_b4db91578068.slice/crio-c3f110b5b89a9a1b6bc7b0c4dda4da315ca4be51d6a0fdf75770c6a346d39665\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6366c530_63f4_4f81_b0eb_b4db91578068.slice\": RecentStats: unable to find data in memory cache]"
Sep 30 07:57:50 crc kubenswrapper[4691]: I0930 07:57:50.513839 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-9gqqq"
Sep 30 07:57:50 crc kubenswrapper[4691]: I0930 07:57:50.570173 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-9gqqq"
Sep 30 07:57:51 crc kubenswrapper[4691]: I0930 07:57:51.235975 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6366c530-63f4-4f81-b0eb-b4db91578068" path="/var/lib/kubelet/pods/6366c530-63f4-4f81-b0eb-b4db91578068/volumes"
Sep 30 07:57:52 crc kubenswrapper[4691]: I0930 07:57:52.917344 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gqqq"]
Sep 30 07:57:52 crc kubenswrapper[4691]: I0930 07:57:52.918013 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-9gqqq" podUID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerName="registry-server" containerID="cri-o://5a99d33905d64610f40669d0a4ad390e3f5974ad29f9ef2bbb996ef1295db38f" gracePeriod=2
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.385269 4691 generic.go:334] "Generic (PLEG): container finished" podID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerID="5a99d33905d64610f40669d0a4ad390e3f5974ad29f9ef2bbb996ef1295db38f" exitCode=0
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.385529 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gqqq" event={"ID":"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab","Type":"ContainerDied","Data":"5a99d33905d64610f40669d0a4ad390e3f5974ad29f9ef2bbb996ef1295db38f"}
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.385557 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-9gqqq" event={"ID":"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab","Type":"ContainerDied","Data":"9adddd86450930895deb814b553d649f2f5994fe8225fda440f93848538a64a4"}
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.385567 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9adddd86450930895deb814b553d649f2f5994fe8225fda440f93848538a64a4"
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.460912 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gqqq"
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.628278 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-utilities\") pod \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") "
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.628348 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-catalog-content\") pod \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") "
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.628412 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hplrw\" (UniqueName: \"kubernetes.io/projected/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-kube-api-access-hplrw\") pod \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\" (UID: \"ca1e0f85-0060-40b4-9ff8-fdb2573fdaab\") "
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.630160 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-utilities" (OuterVolumeSpecName: "utilities") pod "ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" (UID: "ca1e0f85-0060-40b4-9ff8-fdb2573fdaab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.636478 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-kube-api-access-hplrw" (OuterVolumeSpecName: "kube-api-access-hplrw") pod "ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" (UID: "ca1e0f85-0060-40b4-9ff8-fdb2573fdaab"). InnerVolumeSpecName "kube-api-access-hplrw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.691911 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" (UID: "ca1e0f85-0060-40b4-9ff8-fdb2573fdaab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.731211 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.731273 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hplrw\" (UniqueName: \"kubernetes.io/projected/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-kube-api-access-hplrw\") on node \"crc\" DevicePath \"\""
Sep 30 07:57:53 crc kubenswrapper[4691]: I0930 07:57:53.731285 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:57:54 crc kubenswrapper[4691]: I0930 07:57:54.420412 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-9gqqq"
Sep 30 07:57:54 crc kubenswrapper[4691]: I0930 07:57:54.479931 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-9gqqq"]
Sep 30 07:57:54 crc kubenswrapper[4691]: I0930 07:57:54.521587 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-9gqqq"]
Sep 30 07:57:55 crc kubenswrapper[4691]: I0930 07:57:55.257089 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" path="/var/lib/kubelet/pods/ca1e0f85-0060-40b4-9ff8-fdb2573fdaab/volumes"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.120072 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-j66hp"]
Sep 30 07:57:57 crc kubenswrapper[4691]: E0930 07:57:57.120753 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerName="extract-content"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.120767 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerName="extract-content"
Sep 30 07:57:57 crc kubenswrapper[4691]: E0930 07:57:57.120780 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6366c530-63f4-4f81-b0eb-b4db91578068" containerName="extract-utilities"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.120786 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6366c530-63f4-4f81-b0eb-b4db91578068" containerName="extract-utilities"
Sep 30 07:57:57 crc kubenswrapper[4691]: E0930 07:57:57.120811 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6366c530-63f4-4f81-b0eb-b4db91578068" containerName="registry-server"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.120817 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6366c530-63f4-4f81-b0eb-b4db91578068" containerName="registry-server"
Sep 30 07:57:57 crc kubenswrapper[4691]: E0930 07:57:57.120836 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerName="registry-server"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.120841 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerName="registry-server"
Sep 30 07:57:57 crc kubenswrapper[4691]: E0930 07:57:57.120861 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerName="extract-utilities"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.120867 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerName="extract-utilities"
Sep 30 07:57:57 crc kubenswrapper[4691]: E0930 07:57:57.120879 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6366c530-63f4-4f81-b0eb-b4db91578068" containerName="extract-content"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.121040 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="6366c530-63f4-4f81-b0eb-b4db91578068" containerName="extract-content"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.121231 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="6366c530-63f4-4f81-b0eb-b4db91578068" containerName="registry-server"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.121253 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1e0f85-0060-40b4-9ff8-fdb2573fdaab" containerName="registry-server"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.122625 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.159164 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j66hp"]
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.310959 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kqkc\" (UniqueName: \"kubernetes.io/projected/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-kube-api-access-5kqkc\") pod \"redhat-marketplace-j66hp\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") " pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.311393 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-catalog-content\") pod \"redhat-marketplace-j66hp\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") " pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.311670 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-utilities\") pod \"redhat-marketplace-j66hp\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") " pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.414190 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kqkc\" (UniqueName: \"kubernetes.io/projected/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-kube-api-access-5kqkc\") pod \"redhat-marketplace-j66hp\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") " pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.414372 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-catalog-content\") pod \"redhat-marketplace-j66hp\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") " pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.414443 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-utilities\") pod \"redhat-marketplace-j66hp\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") " pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.414859 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-utilities\") pod \"redhat-marketplace-j66hp\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") " pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.415058 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-catalog-content\") pod \"redhat-marketplace-j66hp\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") " pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.450925 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kqkc\" (UniqueName: \"kubernetes.io/projected/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-kube-api-access-5kqkc\") pod \"redhat-marketplace-j66hp\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") " pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.454521 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:57:57 crc kubenswrapper[4691]: I0930 07:57:57.907692 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-j66hp"]
Sep 30 07:57:58 crc kubenswrapper[4691]: I0930 07:57:58.476142 4691 generic.go:334] "Generic (PLEG): container finished" podID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerID="c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240" exitCode=0
Sep 30 07:57:58 crc kubenswrapper[4691]: I0930 07:57:58.476195 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j66hp" event={"ID":"29da4d7e-405d-4b46-b83a-dd9d82b2d51d","Type":"ContainerDied","Data":"c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240"}
Sep 30 07:57:58 crc kubenswrapper[4691]: I0930 07:57:58.476225 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j66hp" event={"ID":"29da4d7e-405d-4b46-b83a-dd9d82b2d51d","Type":"ContainerStarted","Data":"227bcb7a2155b9309900d4a4b72ecac388e9d84a9086948a9a0228394e465ca3"}
Sep 30 07:57:59 crc kubenswrapper[4691]: I0930 07:57:59.492445 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j66hp" event={"ID":"29da4d7e-405d-4b46-b83a-dd9d82b2d51d","Type":"ContainerStarted","Data":"a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa"}
Sep 30 07:58:00 crc kubenswrapper[4691]: I0930 07:58:00.511630 4691 generic.go:334] "Generic (PLEG): container finished" podID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerID="a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa" exitCode=0
Sep 30 07:58:00 crc kubenswrapper[4691]: I0930 07:58:00.511731 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j66hp" event={"ID":"29da4d7e-405d-4b46-b83a-dd9d82b2d51d","Type":"ContainerDied","Data":"a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa"}
Sep 30 07:58:01 crc kubenswrapper[4691]: I0930 07:58:01.523605 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j66hp" event={"ID":"29da4d7e-405d-4b46-b83a-dd9d82b2d51d","Type":"ContainerStarted","Data":"036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0"}
Sep 30 07:58:01 crc kubenswrapper[4691]: I0930 07:58:01.547546 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-j66hp" podStartSLOduration=2.100367181 podStartE2EDuration="4.547525426s" podCreationTimestamp="2025-09-30 07:57:57 +0000 UTC" firstStartedPulling="2025-09-30 07:57:58.478791876 +0000 UTC m=+5921.953812956" lastFinishedPulling="2025-09-30 07:58:00.925950161 +0000 UTC m=+5924.400971201" observedRunningTime="2025-09-30 07:58:01.547138463 +0000 UTC m=+5925.022159533" watchObservedRunningTime="2025-09-30 07:58:01.547525426 +0000 UTC m=+5925.022546466"
Sep 30 07:58:07 crc kubenswrapper[4691]: I0930 07:58:07.456181 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:58:07 crc kubenswrapper[4691]: I0930 07:58:07.456910 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:58:07 crc kubenswrapper[4691]: I0930 07:58:07.543026 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:58:07 crc kubenswrapper[4691]: I0930 07:58:07.696690 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:58:07 crc kubenswrapper[4691]: I0930 07:58:07.797637 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j66hp"]
Sep 30 07:58:09 crc kubenswrapper[4691]: I0930 07:58:09.634040 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-j66hp" podUID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerName="registry-server" containerID="cri-o://036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0" gracePeriod=2
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.199761 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.312202 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kqkc\" (UniqueName: \"kubernetes.io/projected/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-kube-api-access-5kqkc\") pod \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") "
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.312346 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-catalog-content\") pod \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") "
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.312397 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-utilities\") pod \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\" (UID: \"29da4d7e-405d-4b46-b83a-dd9d82b2d51d\") "
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.313254 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-utilities" (OuterVolumeSpecName: "utilities") pod "29da4d7e-405d-4b46-b83a-dd9d82b2d51d" (UID: "29da4d7e-405d-4b46-b83a-dd9d82b2d51d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.320857 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-kube-api-access-5kqkc" (OuterVolumeSpecName: "kube-api-access-5kqkc") pod "29da4d7e-405d-4b46-b83a-dd9d82b2d51d" (UID: "29da4d7e-405d-4b46-b83a-dd9d82b2d51d"). InnerVolumeSpecName "kube-api-access-5kqkc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.336311 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29da4d7e-405d-4b46-b83a-dd9d82b2d51d" (UID: "29da4d7e-405d-4b46-b83a-dd9d82b2d51d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.414365 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kqkc\" (UniqueName: \"kubernetes.io/projected/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-kube-api-access-5kqkc\") on node \"crc\" DevicePath \"\""
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.414393 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.414402 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29da4d7e-405d-4b46-b83a-dd9d82b2d51d-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.650865 4691 generic.go:334] "Generic (PLEG): container finished" podID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerID="036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0" exitCode=0
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.651038 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-j66hp"
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.651043 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j66hp" event={"ID":"29da4d7e-405d-4b46-b83a-dd9d82b2d51d","Type":"ContainerDied","Data":"036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0"}
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.651534 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-j66hp" event={"ID":"29da4d7e-405d-4b46-b83a-dd9d82b2d51d","Type":"ContainerDied","Data":"227bcb7a2155b9309900d4a4b72ecac388e9d84a9086948a9a0228394e465ca3"}
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.651573 4691 scope.go:117] "RemoveContainer" containerID="036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0"
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.705917 4691 scope.go:117] "RemoveContainer" containerID="a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa"
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.717607 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-j66hp"]
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.731935 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-j66hp"]
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.750193 4691 scope.go:117] "RemoveContainer" containerID="c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240"
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.793437 4691 scope.go:117] "RemoveContainer" containerID="036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0"
Sep 30 07:58:10 crc kubenswrapper[4691]: E0930 07:58:10.793938 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0\": container with ID starting with 036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0 not found: ID does not exist" containerID="036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0"
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.794015 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0"} err="failed to get container status \"036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0\": rpc error: code = NotFound desc = could not find container \"036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0\": container with ID starting with 036c6fdc876c3b6d7ac706e13da266541a3edb221246963c033e70a81882ebe0 not found: ID does not exist"
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.794054 4691 scope.go:117] "RemoveContainer" containerID="a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa"
Sep 30 07:58:10 crc kubenswrapper[4691]: E0930 07:58:10.794609 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa\": container with ID starting with a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa not found: ID does not exist" containerID="a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa"
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.794686 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa"} err="failed to get container status \"a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa\": rpc error: code = NotFound desc = could not find container \"a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa\": container with ID starting with a4db3a035e3d43f501212e549049e4581ca88926205146645036292abe8c93fa not found: ID does not exist"
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.794713 4691 scope.go:117] "RemoveContainer" containerID="c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240"
Sep 30 07:58:10 crc kubenswrapper[4691]: E0930 07:58:10.795131 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240\": container with ID starting with c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240 not found: ID does not exist" containerID="c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240"
Sep 30 07:58:10 crc kubenswrapper[4691]: I0930 07:58:10.795160 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240"} err="failed to get container status \"c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240\": rpc error: code = NotFound desc = could not find container \"c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240\": container with ID starting with c22290888dad8efaa5c478e7060d86ab611d24a23222dec068715b3617e84240 not found: ID does not exist"
Sep 30 07:58:11 crc kubenswrapper[4691]: I0930 07:58:11.247252 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" path="/var/lib/kubelet/pods/29da4d7e-405d-4b46-b83a-dd9d82b2d51d/volumes"
Sep 30 07:58:22 crc kubenswrapper[4691]: I0930 07:58:22.849958 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 07:58:22 crc kubenswrapper[4691]: I0930 07:58:22.852287 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 07:58:52 crc kubenswrapper[4691]: I0930 07:58:52.850246 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 07:58:52 crc kubenswrapper[4691]: I0930 07:58:52.851118 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.294954 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5x8pl"]
Sep 30 07:59:05 crc kubenswrapper[4691]: E0930 07:59:05.296128 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerName="registry-server"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.296150 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerName="registry-server"
Sep 30 07:59:05 crc kubenswrapper[4691]: E0930 07:59:05.296183 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerName="extract-utilities"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.296198 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerName="extract-utilities"
Sep 30 07:59:05 crc kubenswrapper[4691]: E0930 07:59:05.296227 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerName="extract-content"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.296240 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerName="extract-content"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.296566 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="29da4d7e-405d-4b46-b83a-dd9d82b2d51d" containerName="registry-server"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.299067 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.311974 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5x8pl"]
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.428192 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95dc8\" (UniqueName: \"kubernetes.io/projected/afa2bb77-2916-4996-a328-d6bf4ff47e3f-kube-api-access-95dc8\") pod \"certified-operators-5x8pl\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") " pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.428456 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-catalog-content\") pod \"certified-operators-5x8pl\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") " pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.428516 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-utilities\") pod \"certified-operators-5x8pl\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") " pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.530309 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-catalog-content\") pod \"certified-operators-5x8pl\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") " pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.530391 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-utilities\") pod \"certified-operators-5x8pl\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") " pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.531174 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-catalog-content\") pod \"certified-operators-5x8pl\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") " pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.531269 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-utilities\") pod \"certified-operators-5x8pl\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") " pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.531549 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95dc8\" (UniqueName: \"kubernetes.io/projected/afa2bb77-2916-4996-a328-d6bf4ff47e3f-kube-api-access-95dc8\") pod \"certified-operators-5x8pl\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") " pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.577433 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95dc8\" (UniqueName: \"kubernetes.io/projected/afa2bb77-2916-4996-a328-d6bf4ff47e3f-kube-api-access-95dc8\") pod \"certified-operators-5x8pl\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") " pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:05 crc kubenswrapper[4691]: I0930 07:59:05.637672 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:06 crc kubenswrapper[4691]: I0930 07:59:06.245115 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5x8pl"]
Sep 30 07:59:06 crc kubenswrapper[4691]: I0930 07:59:06.368636 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5x8pl" event={"ID":"afa2bb77-2916-4996-a328-d6bf4ff47e3f","Type":"ContainerStarted","Data":"a61e7f45adc1c60e511d6e24a2d1c38a4b4e6682d1de7dffb5c7dd458a3d271f"}
Sep 30 07:59:07 crc kubenswrapper[4691]: I0930 07:59:07.389698 4691 generic.go:334] "Generic (PLEG): container finished" podID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerID="a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27" exitCode=0
Sep 30 07:59:07 crc kubenswrapper[4691]: I0930 07:59:07.389753 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5x8pl" event={"ID":"afa2bb77-2916-4996-a328-d6bf4ff47e3f","Type":"ContainerDied","Data":"a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27"}
Sep 30 07:59:09 crc kubenswrapper[4691]: I0930 07:59:09.413879 4691 generic.go:334] "Generic (PLEG): container finished" podID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerID="e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32" exitCode=0
Sep 30 07:59:09 crc kubenswrapper[4691]: I0930 07:59:09.414112 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5x8pl" event={"ID":"afa2bb77-2916-4996-a328-d6bf4ff47e3f","Type":"ContainerDied","Data":"e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32"}
Sep 30 07:59:10 crc kubenswrapper[4691]: I0930 07:59:10.431712 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5x8pl" event={"ID":"afa2bb77-2916-4996-a328-d6bf4ff47e3f","Type":"ContainerStarted","Data":"b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e"}
Sep 30 07:59:15 crc kubenswrapper[4691]: I0930 07:59:15.638237 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:15 crc kubenswrapper[4691]: I0930 07:59:15.638739 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:15 crc kubenswrapper[4691]: I0930 07:59:15.695766 4691 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:15 crc kubenswrapper[4691]: I0930 07:59:15.727804 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5x8pl" podStartSLOduration=8.272351527 podStartE2EDuration="10.727779118s" podCreationTimestamp="2025-09-30 07:59:05 +0000 UTC" firstStartedPulling="2025-09-30 07:59:07.393713201 +0000 UTC m=+5990.868734251" lastFinishedPulling="2025-09-30 07:59:09.849140762 +0000 UTC m=+5993.324161842" observedRunningTime="2025-09-30 07:59:10.460497208 +0000 UTC m=+5993.935518308" watchObservedRunningTime="2025-09-30 07:59:15.727779118 +0000 UTC m=+5999.202800198"
Sep 30 07:59:16 crc kubenswrapper[4691]: I0930 07:59:16.551383 4691 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:16 crc kubenswrapper[4691]: I0930 07:59:16.614619 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5x8pl"]
Sep 30 07:59:18 crc kubenswrapper[4691]: I0930 07:59:18.519713 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5x8pl" podUID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerName="registry-server" containerID="cri-o://b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e" gracePeriod=2
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.038719 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.155788 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-utilities\") pod \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") "
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.155835 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-catalog-content\") pod \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") "
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.156000 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95dc8\" (UniqueName: \"kubernetes.io/projected/afa2bb77-2916-4996-a328-d6bf4ff47e3f-kube-api-access-95dc8\") pod \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\" (UID: \"afa2bb77-2916-4996-a328-d6bf4ff47e3f\") "
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.156955 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-utilities" (OuterVolumeSpecName: "utilities") pod "afa2bb77-2916-4996-a328-d6bf4ff47e3f" (UID: "afa2bb77-2916-4996-a328-d6bf4ff47e3f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.165810 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa2bb77-2916-4996-a328-d6bf4ff47e3f-kube-api-access-95dc8" (OuterVolumeSpecName: "kube-api-access-95dc8") pod "afa2bb77-2916-4996-a328-d6bf4ff47e3f" (UID: "afa2bb77-2916-4996-a328-d6bf4ff47e3f"). InnerVolumeSpecName "kube-api-access-95dc8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.261326 4691 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-utilities\") on node \"crc\" DevicePath \"\""
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.261592 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95dc8\" (UniqueName: \"kubernetes.io/projected/afa2bb77-2916-4996-a328-d6bf4ff47e3f-kube-api-access-95dc8\") on node \"crc\" DevicePath \"\""
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.273115 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "afa2bb77-2916-4996-a328-d6bf4ff47e3f" (UID: "afa2bb77-2916-4996-a328-d6bf4ff47e3f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.364310 4691 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/afa2bb77-2916-4996-a328-d6bf4ff47e3f-catalog-content\") on node \"crc\" DevicePath \"\""
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.533757 4691 generic.go:334] "Generic (PLEG): container finished" podID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerID="b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e" exitCode=0
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.533834 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5x8pl" event={"ID":"afa2bb77-2916-4996-a328-d6bf4ff47e3f","Type":"ContainerDied","Data":"b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e"}
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.533847 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5x8pl"
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.533868 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5x8pl" event={"ID":"afa2bb77-2916-4996-a328-d6bf4ff47e3f","Type":"ContainerDied","Data":"a61e7f45adc1c60e511d6e24a2d1c38a4b4e6682d1de7dffb5c7dd458a3d271f"}
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.533920 4691 scope.go:117] "RemoveContainer" containerID="b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e"
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.570178 4691 scope.go:117] "RemoveContainer" containerID="e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32"
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.591996 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5x8pl"]
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.608132 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5x8pl"]
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.608852 4691 scope.go:117] "RemoveContainer" containerID="a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27"
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.643796 4691 scope.go:117] "RemoveContainer" containerID="b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e"
Sep 30 07:59:19 crc kubenswrapper[4691]: E0930 07:59:19.644327 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e\": container with ID starting with b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e not found: ID does not exist" containerID="b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e"
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.644377 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e"} err="failed to get container status \"b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e\": rpc error: code = NotFound desc = could not find container \"b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e\": container with ID starting with b3e92b5b28e037609748e346cbb9ae204efb739578b3be26e26896e6308b0c7e not found: ID does not exist"
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.644405 4691 scope.go:117] "RemoveContainer" containerID="e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32"
Sep 30 07:59:19 crc kubenswrapper[4691]: E0930 07:59:19.644761 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32\": container with ID starting with e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32 not found: ID does not exist" containerID="e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32"
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.644786 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32"} err="failed to get container status \"e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32\": rpc error: code = NotFound desc = could not find container \"e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32\": container with ID starting with e7a5cb2897a3d3d431bfc081d04bdfeebf9f45313ff778aac977d822e1b55b32 not found: ID does not exist"
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.644800 4691 scope.go:117] "RemoveContainer" containerID="a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27"
Sep 30 07:59:19 crc kubenswrapper[4691]: E0930 07:59:19.645250 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27\": container with ID starting with a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27 not found: ID does not exist" containerID="a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27"
Sep 30 07:59:19 crc kubenswrapper[4691]: I0930 07:59:19.645280 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27"} err="failed to get container status \"a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27\": rpc error: code = NotFound desc = could not find container \"a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27\": container with ID starting with a923083b4b12fd6a1bf3cd21cc6164ddb97a04129f1cb9ddeb5c43183b94df27 not found: ID does not exist"
Sep 30 07:59:20 crc kubenswrapper[4691]: I0930 07:59:20.545507 4691 generic.go:334] "Generic (PLEG): container finished" podID="8405e9df-8bb3-4a22-8b85-1fa652143de8" containerID="bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec" exitCode=0
Sep 30 07:59:20 crc kubenswrapper[4691]: I0930 07:59:20.545647 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-78mzf/must-gather-6xhk8" event={"ID":"8405e9df-8bb3-4a22-8b85-1fa652143de8","Type":"ContainerDied","Data":"bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec"}
Sep 30 07:59:20 crc kubenswrapper[4691]: I0930 07:59:20.546865 4691 scope.go:117] "RemoveContainer" containerID="bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec"
Sep 30 07:59:21 crc kubenswrapper[4691]: I0930 07:59:21.235233 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" path="/var/lib/kubelet/pods/afa2bb77-2916-4996-a328-d6bf4ff47e3f/volumes"
Sep 30 07:59:21 crc kubenswrapper[4691]: I0930 07:59:21.549485 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-78mzf_must-gather-6xhk8_8405e9df-8bb3-4a22-8b85-1fa652143de8/gather/0.log"
Sep 30 07:59:22 crc kubenswrapper[4691]: I0930 07:59:22.849715 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Sep 30 07:59:22 crc kubenswrapper[4691]: I0930 07:59:22.850062 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Sep 30 07:59:22 crc kubenswrapper[4691]: I0930 07:59:22.850481 4691 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6"
Sep 30 07:59:22 crc kubenswrapper[4691]: I0930 07:59:22.851342 4691 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a"} pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Sep 30 07:59:22 crc kubenswrapper[4691]: I0930 07:59:22.851396 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" containerID="cri-o://aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" gracePeriod=600
Sep 30 07:59:22 crc kubenswrapper[4691]: E0930 07:59:22.978593 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:59:23 crc kubenswrapper[4691]: I0930 07:59:23.585296 4691 generic.go:334] "Generic (PLEG): container finished" podID="69b46ade-8260-448f-84b7-506632d23ff9" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" exitCode=0
Sep 30 07:59:23 crc kubenswrapper[4691]: I0930 07:59:23.585367 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerDied","Data":"aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a"}
Sep 30 07:59:23 crc kubenswrapper[4691]: I0930 07:59:23.585806 4691 scope.go:117] "RemoveContainer" containerID="37d19b6a60363a5a6aed5989b48791cca474e4f931fda010298ee4b265d6d360"
Sep 30 07:59:23 crc kubenswrapper[4691]: I0930 07:59:23.587020 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a"
Sep 30 07:59:23 crc kubenswrapper[4691]: E0930 07:59:23.587526 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9"
Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.045793 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-78mzf/must-gather-6xhk8"]
Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.052241 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-78mzf/must-gather-6xhk8" podUID="8405e9df-8bb3-4a22-8b85-1fa652143de8" containerName="copy" containerID="cri-o://c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d" gracePeriod=2
Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.062408 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-78mzf/must-gather-6xhk8"]
Sep 30
07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.494812 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-78mzf_must-gather-6xhk8_8405e9df-8bb3-4a22-8b85-1fa652143de8/copy/0.log" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.495671 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-78mzf/must-gather-6xhk8" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.639074 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8405e9df-8bb3-4a22-8b85-1fa652143de8-must-gather-output\") pod \"8405e9df-8bb3-4a22-8b85-1fa652143de8\" (UID: \"8405e9df-8bb3-4a22-8b85-1fa652143de8\") " Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.639147 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kj6zx\" (UniqueName: \"kubernetes.io/projected/8405e9df-8bb3-4a22-8b85-1fa652143de8-kube-api-access-kj6zx\") pod \"8405e9df-8bb3-4a22-8b85-1fa652143de8\" (UID: \"8405e9df-8bb3-4a22-8b85-1fa652143de8\") " Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.646598 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8405e9df-8bb3-4a22-8b85-1fa652143de8-kube-api-access-kj6zx" (OuterVolumeSpecName: "kube-api-access-kj6zx") pod "8405e9df-8bb3-4a22-8b85-1fa652143de8" (UID: "8405e9df-8bb3-4a22-8b85-1fa652143de8"). InnerVolumeSpecName "kube-api-access-kj6zx". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.682839 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-78mzf_must-gather-6xhk8_8405e9df-8bb3-4a22-8b85-1fa652143de8/copy/0.log" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.683395 4691 generic.go:334] "Generic (PLEG): container finished" podID="8405e9df-8bb3-4a22-8b85-1fa652143de8" containerID="c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d" exitCode=143 Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.683454 4691 scope.go:117] "RemoveContainer" containerID="c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.683615 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-78mzf/must-gather-6xhk8" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.706123 4691 scope.go:117] "RemoveContainer" containerID="bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.747527 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kj6zx\" (UniqueName: \"kubernetes.io/projected/8405e9df-8bb3-4a22-8b85-1fa652143de8-kube-api-access-kj6zx\") on node \"crc\" DevicePath \"\"" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.804056 4691 scope.go:117] "RemoveContainer" containerID="c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d" Sep 30 07:59:30 crc kubenswrapper[4691]: E0930 07:59:30.808011 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d\": container with ID starting with c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d not found: ID does not exist" containerID="c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.808052 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d"} err="failed to get container status \"c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d\": rpc error: code = NotFound desc = could not find container \"c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d\": container with ID starting with c903f0050d5c1653352cb808ca581132207115e5e53e3379df6fa2b1682a7e2d not found: ID does not exist" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.808076 4691 scope.go:117] "RemoveContainer" containerID="bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec" Sep 30 07:59:30 crc kubenswrapper[4691]: E0930 07:59:30.817083 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec\": container with ID starting with bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec not found: ID does not exist" containerID="bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.817134 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec"} err="failed to get container status \"bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec\": rpc error: code = NotFound desc = could not find container \"bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec\": container with ID starting with bcdc7418b47009c90688ca12101c3d50139645481ec0e0b26b840f3bd9192fec not found: ID does not exist" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.876316 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8405e9df-8bb3-4a22-8b85-1fa652143de8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8405e9df-8bb3-4a22-8b85-1fa652143de8" (UID: "8405e9df-8bb3-4a22-8b85-1fa652143de8"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 07:59:30 crc kubenswrapper[4691]: I0930 07:59:30.951310 4691 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8405e9df-8bb3-4a22-8b85-1fa652143de8-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 07:59:31 crc kubenswrapper[4691]: I0930 07:59:31.237001 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8405e9df-8bb3-4a22-8b85-1fa652143de8" path="/var/lib/kubelet/pods/8405e9df-8bb3-4a22-8b85-1fa652143de8/volumes" Sep 30 07:59:35 crc kubenswrapper[4691]: I0930 07:59:35.225969 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 07:59:35 crc kubenswrapper[4691]: E0930 07:59:35.227330 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:59:49 crc kubenswrapper[4691]: I0930 07:59:49.225937 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 07:59:49 crc kubenswrapper[4691]: E0930 07:59:49.226878 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 07:59:55 crc kubenswrapper[4691]: I0930 07:59:55.998737 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hhnpm/must-gather-ccdnm"] Sep 30 07:59:56 crc kubenswrapper[4691]: E0930 07:59:55.999709 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8405e9df-8bb3-4a22-8b85-1fa652143de8" containerName="copy" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:55.999724 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="8405e9df-8bb3-4a22-8b85-1fa652143de8" containerName="copy" Sep 30 07:59:56 crc kubenswrapper[4691]: E0930 07:59:55.999744 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerName="extract-content" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:55.999754 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerName="extract-content" Sep 30 07:59:56 crc kubenswrapper[4691]: E0930 07:59:55.999762 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerName="registry-server" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:55.999770 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerName="registry-server" Sep 30 07:59:56 crc kubenswrapper[4691]: E0930 07:59:55.999778 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8405e9df-8bb3-4a22-8b85-1fa652143de8" containerName="gather" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:55.999786 4691 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8405e9df-8bb3-4a22-8b85-1fa652143de8" containerName="gather" Sep 30 07:59:56 crc kubenswrapper[4691]: E0930 07:59:55.999811 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerName="extract-utilities" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:55.999818 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerName="extract-utilities" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.000095 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa2bb77-2916-4996-a328-d6bf4ff47e3f" containerName="registry-server" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.000130 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8405e9df-8bb3-4a22-8b85-1fa652143de8" containerName="gather" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.000145 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="8405e9df-8bb3-4a22-8b85-1fa652143de8" containerName="copy" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.001338 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hhnpm/must-gather-ccdnm" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.003574 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hhnpm"/"openshift-service-ca.crt" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.003921 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-hhnpm"/"kube-root-ca.crt" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.025328 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hhnpm/must-gather-ccdnm"] Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.119637 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-must-gather-output\") pod \"must-gather-ccdnm\" (UID: \"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1\") " pod="openshift-must-gather-hhnpm/must-gather-ccdnm" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.119742 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwsjb\" (UniqueName: \"kubernetes.io/projected/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-kube-api-access-xwsjb\") pod \"must-gather-ccdnm\" (UID: \"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1\") " pod="openshift-must-gather-hhnpm/must-gather-ccdnm" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.221897 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-must-gather-output\") pod \"must-gather-ccdnm\" (UID: \"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1\") " pod="openshift-must-gather-hhnpm/must-gather-ccdnm" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.222034 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwsjb\" (UniqueName: \"kubernetes.io/projected/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-kube-api-access-xwsjb\") pod \"must-gather-ccdnm\" (UID: \"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1\") " pod="openshift-must-gather-hhnpm/must-gather-ccdnm" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.222621 4691 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-must-gather-output\") pod \"must-gather-ccdnm\" (UID: \"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1\") " pod="openshift-must-gather-hhnpm/must-gather-ccdnm" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.241687 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwsjb\" (UniqueName: \"kubernetes.io/projected/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-kube-api-access-xwsjb\") pod \"must-gather-ccdnm\" (UID: \"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1\") " pod="openshift-must-gather-hhnpm/must-gather-ccdnm" Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.329689 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hhnpm/must-gather-ccdnm" Sep 30 07:59:56 crc kubenswrapper[4691]: W0930 07:59:56.804824 4691 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod39eb777b_f6a5_40bc_b2c2_f27c3c6ed1e1.slice/crio-2c88b5814d8fe270c3cc1ecfed39b7743570994ef7453a47d5aad45fdcd83e07 WatchSource:0}: Error finding container 2c88b5814d8fe270c3cc1ecfed39b7743570994ef7453a47d5aad45fdcd83e07: Status 404 returned error can't find the container with id 2c88b5814d8fe270c3cc1ecfed39b7743570994ef7453a47d5aad45fdcd83e07 Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.807967 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-hhnpm/must-gather-ccdnm"] Sep 30 07:59:56 crc kubenswrapper[4691]: I0930 07:59:56.980812 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/must-gather-ccdnm" event={"ID":"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1","Type":"ContainerStarted","Data":"2c88b5814d8fe270c3cc1ecfed39b7743570994ef7453a47d5aad45fdcd83e07"} Sep 30 07:59:57 crc kubenswrapper[4691]: I0930 07:59:57.990711 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/must-gather-ccdnm" event={"ID":"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1","Type":"ContainerStarted","Data":"dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22"} Sep 30 07:59:57 crc kubenswrapper[4691]: I0930 07:59:57.991193 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/must-gather-ccdnm" event={"ID":"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1","Type":"ContainerStarted","Data":"f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201"} Sep 30 07:59:58 crc kubenswrapper[4691]: I0930 07:59:58.014506 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hhnpm/must-gather-ccdnm" podStartSLOduration=3.014492649 podStartE2EDuration="3.014492649s" podCreationTimestamp="2025-09-30 07:59:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 07:59:58.010129499 +0000 UTC m=+6041.485150539" watchObservedRunningTime="2025-09-30 07:59:58.014492649 +0000 UTC m=+6041.489513689" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.160585 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t"] Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.162144 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.171374 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t"] Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.179532 4691 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.179678 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.226235 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:00:00 crc kubenswrapper[4691]: E0930 08:00:00.226435 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.309190 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37ed5786-2149-462c-afb3-6117272c87fd-config-volume\") pod \"collect-profiles-29320320-f4g9t\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.309448 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68stm\" (UniqueName: \"kubernetes.io/projected/37ed5786-2149-462c-afb3-6117272c87fd-kube-api-access-68stm\") pod \"collect-profiles-29320320-f4g9t\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.309490 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37ed5786-2149-462c-afb3-6117272c87fd-secret-volume\") pod \"collect-profiles-29320320-f4g9t\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.411088 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68stm\" (UniqueName: \"kubernetes.io/projected/37ed5786-2149-462c-afb3-6117272c87fd-kube-api-access-68stm\") pod \"collect-profiles-29320320-f4g9t\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.411149 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37ed5786-2149-462c-afb3-6117272c87fd-secret-volume\") pod \"collect-profiles-29320320-f4g9t\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 
30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.411186 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37ed5786-2149-462c-afb3-6117272c87fd-config-volume\") pod \"collect-profiles-29320320-f4g9t\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.412131 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37ed5786-2149-462c-afb3-6117272c87fd-config-volume\") pod \"collect-profiles-29320320-f4g9t\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.416490 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37ed5786-2149-462c-afb3-6117272c87fd-secret-volume\") pod \"collect-profiles-29320320-f4g9t\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.428409 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68stm\" (UniqueName: \"kubernetes.io/projected/37ed5786-2149-462c-afb3-6117272c87fd-kube-api-access-68stm\") pod \"collect-profiles-29320320-f4g9t\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.511237 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.858085 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hhnpm/crc-debug-hktr8"] Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.859507 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-hktr8" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.861587 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hhnpm"/"default-dockercfg-zn5xk" Sep 30 08:00:00 crc kubenswrapper[4691]: I0930 08:00:00.987212 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t"] Sep 30 08:00:01 crc kubenswrapper[4691]: I0930 08:00:01.016748 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" event={"ID":"37ed5786-2149-462c-afb3-6117272c87fd","Type":"ContainerStarted","Data":"83a1e383e83a355dab893b2a0b7d819d6be4b8fe7afff64ecdb3247e28f58f54"} Sep 30 08:00:01 crc kubenswrapper[4691]: I0930 08:00:01.024233 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3809769-1691-4da5-8006-42a670facdb2-host\") pod \"crc-debug-hktr8\" (UID: \"c3809769-1691-4da5-8006-42a670facdb2\") " pod="openshift-must-gather-hhnpm/crc-debug-hktr8" Sep 30 08:00:01 crc kubenswrapper[4691]: I0930 08:00:01.024297 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8jk4\" (UniqueName: \"kubernetes.io/projected/c3809769-1691-4da5-8006-42a670facdb2-kube-api-access-w8jk4\") pod \"crc-debug-hktr8\" (UID: \"c3809769-1691-4da5-8006-42a670facdb2\") " pod="openshift-must-gather-hhnpm/crc-debug-hktr8" Sep 30 08:00:01 crc kubenswrapper[4691]: I0930 08:00:01.126011 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3809769-1691-4da5-8006-42a670facdb2-host\") pod \"crc-debug-hktr8\" (UID: \"c3809769-1691-4da5-8006-42a670facdb2\") " pod="openshift-must-gather-hhnpm/crc-debug-hktr8" Sep 30 08:00:01 crc kubenswrapper[4691]: I0930 08:00:01.126108 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8jk4\" (UniqueName: \"kubernetes.io/projected/c3809769-1691-4da5-8006-42a670facdb2-kube-api-access-w8jk4\") pod \"crc-debug-hktr8\" (UID: \"c3809769-1691-4da5-8006-42a670facdb2\") " pod="openshift-must-gather-hhnpm/crc-debug-hktr8" Sep 30 08:00:01 crc kubenswrapper[4691]: I0930 08:00:01.126148 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3809769-1691-4da5-8006-42a670facdb2-host\") pod \"crc-debug-hktr8\" (UID: \"c3809769-1691-4da5-8006-42a670facdb2\") " pod="openshift-must-gather-hhnpm/crc-debug-hktr8" Sep 30 08:00:01 crc kubenswrapper[4691]: I0930 08:00:01.145223 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8jk4\" (UniqueName: \"kubernetes.io/projected/c3809769-1691-4da5-8006-42a670facdb2-kube-api-access-w8jk4\") pod \"crc-debug-hktr8\" (UID: \"c3809769-1691-4da5-8006-42a670facdb2\") " pod="openshift-must-gather-hhnpm/crc-debug-hktr8" Sep 30 08:00:01 crc kubenswrapper[4691]: I0930 08:00:01.180227 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-hktr8" Sep 30 08:00:02 crc kubenswrapper[4691]: I0930 08:00:02.025366 4691 generic.go:334] "Generic (PLEG): container finished" podID="37ed5786-2149-462c-afb3-6117272c87fd" containerID="243467e1555cc133d1b0bed9d2a1e3d5074551620e15ae1e9f3a8bbbe8e960a0" exitCode=0 Sep 30 08:00:02 crc kubenswrapper[4691]: I0930 08:00:02.025458 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" event={"ID":"37ed5786-2149-462c-afb3-6117272c87fd","Type":"ContainerDied","Data":"243467e1555cc133d1b0bed9d2a1e3d5074551620e15ae1e9f3a8bbbe8e960a0"} Sep 30 08:00:02 crc kubenswrapper[4691]: I0930 08:00:02.027397 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/crc-debug-hktr8" event={"ID":"c3809769-1691-4da5-8006-42a670facdb2","Type":"ContainerStarted","Data":"23cd875c3665508cb3172658942abea599d7610c29c55827096cc831a5c32322"} Sep 30 08:00:02 crc kubenswrapper[4691]: I0930 08:00:02.027429 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/crc-debug-hktr8" event={"ID":"c3809769-1691-4da5-8006-42a670facdb2","Type":"ContainerStarted","Data":"f0623aeaca9ea9cba9695e3f183a4ecf7b3bd89507860c5088e74f3d63bd4bdc"} Sep 30 08:00:02 crc kubenswrapper[4691]: I0930 08:00:02.069539 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-hhnpm/crc-debug-hktr8" podStartSLOduration=2.069524298 podStartE2EDuration="2.069524298s" podCreationTimestamp="2025-09-30 08:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 08:00:02.062520853 +0000 UTC m=+6045.537541893" watchObservedRunningTime="2025-09-30 08:00:02.069524298 +0000 UTC m=+6045.544545338" Sep 30 08:00:03 crc kubenswrapper[4691]: I0930 08:00:03.454615 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:03 crc kubenswrapper[4691]: I0930 08:00:03.606113 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37ed5786-2149-462c-afb3-6117272c87fd-secret-volume\") pod \"37ed5786-2149-462c-afb3-6117272c87fd\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " Sep 30 08:00:03 crc kubenswrapper[4691]: I0930 08:00:03.606197 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68stm\" (UniqueName: \"kubernetes.io/projected/37ed5786-2149-462c-afb3-6117272c87fd-kube-api-access-68stm\") pod \"37ed5786-2149-462c-afb3-6117272c87fd\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " Sep 30 08:00:03 crc kubenswrapper[4691]: I0930 08:00:03.606440 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37ed5786-2149-462c-afb3-6117272c87fd-config-volume\") pod \"37ed5786-2149-462c-afb3-6117272c87fd\" (UID: \"37ed5786-2149-462c-afb3-6117272c87fd\") " Sep 30 08:00:03 crc kubenswrapper[4691]: I0930 08:00:03.607398 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/37ed5786-2149-462c-afb3-6117272c87fd-config-volume" (OuterVolumeSpecName: "config-volume") pod "37ed5786-2149-462c-afb3-6117272c87fd" (UID: "37ed5786-2149-462c-afb3-6117272c87fd"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Sep 30 08:00:03 crc kubenswrapper[4691]: I0930 08:00:03.617247 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/37ed5786-2149-462c-afb3-6117272c87fd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "37ed5786-2149-462c-afb3-6117272c87fd" (UID: "37ed5786-2149-462c-afb3-6117272c87fd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:00:03 crc kubenswrapper[4691]: I0930 08:00:03.617635 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/37ed5786-2149-462c-afb3-6117272c87fd-kube-api-access-68stm" (OuterVolumeSpecName: "kube-api-access-68stm") pod "37ed5786-2149-462c-afb3-6117272c87fd" (UID: "37ed5786-2149-462c-afb3-6117272c87fd"). InnerVolumeSpecName "kube-api-access-68stm". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:00:03 crc kubenswrapper[4691]: I0930 08:00:03.709691 4691 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/37ed5786-2149-462c-afb3-6117272c87fd-config-volume\") on node \"crc\" DevicePath \"\"" Sep 30 08:00:03 crc kubenswrapper[4691]: I0930 08:00:03.709730 4691 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/37ed5786-2149-462c-afb3-6117272c87fd-secret-volume\") on node \"crc\" DevicePath \"\"" Sep 30 08:00:03 crc kubenswrapper[4691]: I0930 08:00:03.709744 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68stm\" (UniqueName: \"kubernetes.io/projected/37ed5786-2149-462c-afb3-6117272c87fd-kube-api-access-68stm\") on node \"crc\" DevicePath \"\"" Sep 30 08:00:04 crc kubenswrapper[4691]: I0930 08:00:04.046533 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" event={"ID":"37ed5786-2149-462c-afb3-6117272c87fd","Type":"ContainerDied","Data":"83a1e383e83a355dab893b2a0b7d819d6be4b8fe7afff64ecdb3247e28f58f54"} Sep 30 08:00:04 crc kubenswrapper[4691]: I0930 08:00:04.046576 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83a1e383e83a355dab893b2a0b7d819d6be4b8fe7afff64ecdb3247e28f58f54" Sep 30 08:00:04 crc kubenswrapper[4691]: I0930 08:00:04.046635 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29320320-f4g9t" Sep 30 08:00:04 crc kubenswrapper[4691]: I0930 08:00:04.541792 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp"] Sep 30 08:00:04 crc kubenswrapper[4691]: I0930 08:00:04.554404 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29320275-bpbdp"] Sep 30 08:00:05 crc kubenswrapper[4691]: I0930 08:00:05.235018 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4fb18c4-f0fd-438b-a522-1a7807fb7b30" path="/var/lib/kubelet/pods/c4fb18c4-f0fd-438b-a522-1a7807fb7b30/volumes" Sep 30 08:00:11 crc kubenswrapper[4691]: I0930 08:00:11.225289 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:00:11 crc kubenswrapper[4691]: E0930 08:00:11.225995 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:00:22 crc kubenswrapper[4691]: I0930 08:00:22.225448 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:00:22 crc kubenswrapper[4691]: E0930 08:00:22.226110 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:00:28 crc kubenswrapper[4691]: I0930 08:00:28.155899 4691 scope.go:117] "RemoveContainer" containerID="656a3487a2e2480d56f0b95bb8f030caa95accbee7d82b2d6e4df1185f2d34d2" Sep 30 08:00:33 crc kubenswrapper[4691]: I0930 08:00:33.224905 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:00:33 crc kubenswrapper[4691]: E0930 08:00:33.226600 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:00:47 crc kubenswrapper[4691]: I0930 08:00:47.237935 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:00:47 crc kubenswrapper[4691]: E0930 08:00:47.239228 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.151275 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29320321-5bgjr"] Sep 30 08:01:00 crc kubenswrapper[4691]: E0930 08:01:00.153338 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="37ed5786-2149-462c-afb3-6117272c87fd" containerName="collect-profiles" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.153479 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="37ed5786-2149-462c-afb3-6117272c87fd" containerName="collect-profiles" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.153875 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="37ed5786-2149-462c-afb3-6117272c87fd" containerName="collect-profiles" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.154847 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.192019 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320321-5bgjr"] Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.262927 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvf29\" (UniqueName: \"kubernetes.io/projected/0511fcca-418a-452f-a2de-32c8edb206f2-kube-api-access-cvf29\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.262985 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-fernet-keys\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.263160 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-combined-ca-bundle\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.265193 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-config-data\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.373080 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-combined-ca-bundle\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.373156 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-config-data\") pod \"keystone-cron-29320321-5bgjr\" (UID: 
\"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.373242 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvf29\" (UniqueName: \"kubernetes.io/projected/0511fcca-418a-452f-a2de-32c8edb206f2-kube-api-access-cvf29\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.373271 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-fernet-keys\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.379810 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-fernet-keys\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.385659 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-combined-ca-bundle\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.387867 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-config-data\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.409218 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvf29\" (UniqueName: \"kubernetes.io/projected/0511fcca-418a-452f-a2de-32c8edb206f2-kube-api-access-cvf29\") pod \"keystone-cron-29320321-5bgjr\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.478022 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:00 crc kubenswrapper[4691]: I0930 08:01:00.967854 4691 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29320321-5bgjr"] Sep 30 08:01:01 crc kubenswrapper[4691]: I0930 08:01:01.226271 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:01:01 crc kubenswrapper[4691]: E0930 08:01:01.227183 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:01:01 crc kubenswrapper[4691]: I0930 08:01:01.618764 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320321-5bgjr" event={"ID":"0511fcca-418a-452f-a2de-32c8edb206f2","Type":"ContainerStarted","Data":"c418a5eba4960ee1b73890e6913e59e2493d15b35666a62bd6dca2aa5365458e"} Sep 30 08:01:01 crc kubenswrapper[4691]: I0930 08:01:01.618808 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320321-5bgjr" event={"ID":"0511fcca-418a-452f-a2de-32c8edb206f2","Type":"ContainerStarted","Data":"f60a448d9a6c38c1441ec79e5d29119ba345e230ccc6a75c6c780a799543112a"} Sep 30 08:01:01 crc kubenswrapper[4691]: I0930 08:01:01.639694 4691 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29320321-5bgjr" podStartSLOduration=1.639662366 podStartE2EDuration="1.639662366s" podCreationTimestamp="2025-09-30 08:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-09-30 08:01:01.634792659 +0000 UTC m=+6105.109813719" watchObservedRunningTime="2025-09-30 08:01:01.639662366 +0000 UTC m=+6105.114683486" Sep 30 08:01:04 crc kubenswrapper[4691]: I0930 08:01:04.652386 4691 generic.go:334] "Generic (PLEG): container finished" podID="0511fcca-418a-452f-a2de-32c8edb206f2" containerID="c418a5eba4960ee1b73890e6913e59e2493d15b35666a62bd6dca2aa5365458e" exitCode=0 Sep 30 08:01:04 crc kubenswrapper[4691]: I0930 08:01:04.652511 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320321-5bgjr" event={"ID":"0511fcca-418a-452f-a2de-32c8edb206f2","Type":"ContainerDied","Data":"c418a5eba4960ee1b73890e6913e59e2493d15b35666a62bd6dca2aa5365458e"} Sep 30 08:01:05 crc kubenswrapper[4691]: I0930 08:01:05.993561 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.095032 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-combined-ca-bundle\") pod \"0511fcca-418a-452f-a2de-32c8edb206f2\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.095215 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-config-data\") pod \"0511fcca-418a-452f-a2de-32c8edb206f2\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.095262 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-fernet-keys\") pod \"0511fcca-418a-452f-a2de-32c8edb206f2\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.095373 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvf29\" (UniqueName: \"kubernetes.io/projected/0511fcca-418a-452f-a2de-32c8edb206f2-kube-api-access-cvf29\") pod \"0511fcca-418a-452f-a2de-32c8edb206f2\" (UID: \"0511fcca-418a-452f-a2de-32c8edb206f2\") " Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.103221 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0511fcca-418a-452f-a2de-32c8edb206f2-kube-api-access-cvf29" (OuterVolumeSpecName: "kube-api-access-cvf29") pod "0511fcca-418a-452f-a2de-32c8edb206f2" (UID: "0511fcca-418a-452f-a2de-32c8edb206f2"). InnerVolumeSpecName "kube-api-access-cvf29". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.117104 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0511fcca-418a-452f-a2de-32c8edb206f2" (UID: "0511fcca-418a-452f-a2de-32c8edb206f2"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.125827 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0511fcca-418a-452f-a2de-32c8edb206f2" (UID: "0511fcca-418a-452f-a2de-32c8edb206f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.154277 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-config-data" (OuterVolumeSpecName: "config-data") pod "0511fcca-418a-452f-a2de-32c8edb206f2" (UID: "0511fcca-418a-452f-a2de-32c8edb206f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.197245 4691 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-config-data\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.197283 4691 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-fernet-keys\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.197297 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvf29\" (UniqueName: \"kubernetes.io/projected/0511fcca-418a-452f-a2de-32c8edb206f2-kube-api-access-cvf29\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.197312 4691 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0511fcca-418a-452f-a2de-32c8edb206f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.675599 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29320321-5bgjr" event={"ID":"0511fcca-418a-452f-a2de-32c8edb206f2","Type":"ContainerDied","Data":"f60a448d9a6c38c1441ec79e5d29119ba345e230ccc6a75c6c780a799543112a"} Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.675990 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f60a448d9a6c38c1441ec79e5d29119ba345e230ccc6a75c6c780a799543112a" Sep 30 08:01:06 crc kubenswrapper[4691]: I0930 08:01:06.675676 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29320321-5bgjr" Sep 30 08:01:15 crc kubenswrapper[4691]: I0930 08:01:15.228277 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:01:15 crc kubenswrapper[4691]: E0930 08:01:15.229135 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:01:18 crc kubenswrapper[4691]: I0930 08:01:18.957157 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-68888bb5f6-d225g_29d7ded5-bae4-41e2-9aa9-c959091d3696/barbican-api/0.log" Sep 30 08:01:19 crc kubenswrapper[4691]: I0930 08:01:19.027670 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-68888bb5f6-d225g_29d7ded5-bae4-41e2-9aa9-c959091d3696/barbican-api-log/0.log" Sep 30 08:01:19 crc kubenswrapper[4691]: I0930 08:01:19.204875 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5c7dfdd7cd-qdz5k_bccc96eb-4a1a-44bc-8086-eb5e7a7ce253/barbican-keystone-listener/0.log" Sep 30 08:01:19 crc kubenswrapper[4691]: I0930 08:01:19.298634 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-5c7dfdd7cd-qdz5k_bccc96eb-4a1a-44bc-8086-eb5e7a7ce253/barbican-keystone-listener-log/0.log" Sep 30 08:01:19 crc kubenswrapper[4691]: I0930 08:01:19.419626 4691 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d7ff878f-9tz9w_9d94cb2d-a415-4b43-9976-0a844c446734/barbican-worker/0.log" Sep 30 08:01:19 crc kubenswrapper[4691]: I0930 08:01:19.497560 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-5d7ff878f-9tz9w_9d94cb2d-a415-4b43-9976-0a844c446734/barbican-worker-log/0.log" Sep 30 08:01:19 crc kubenswrapper[4691]: I0930 08:01:19.662786 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-wdn8x_c6027156-9dfc-40c5-b265-96d0231b32d6/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:19 crc kubenswrapper[4691]: I0930 08:01:19.923569 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ac8d6a42-d8ce-419f-ae31-d9746dcedea9/ceilometer-central-agent/0.log" Sep 30 08:01:19 crc kubenswrapper[4691]: I0930 08:01:19.934452 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ac8d6a42-d8ce-419f-ae31-d9746dcedea9/ceilometer-notification-agent/0.log" Sep 30 08:01:19 crc kubenswrapper[4691]: I0930 08:01:19.953653 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ac8d6a42-d8ce-419f-ae31-d9746dcedea9/proxy-httpd/0.log" Sep 30 08:01:20 crc kubenswrapper[4691]: I0930 08:01:20.096635 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_ac8d6a42-d8ce-419f-ae31-d9746dcedea9/sg-core/0.log" Sep 30 08:01:20 crc kubenswrapper[4691]: I0930 08:01:20.337420 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_397c7023-cd6a-42ac-8d37-5813f5f9d45e/cinder-api-log/0.log" Sep 30 08:01:20 crc kubenswrapper[4691]: I0930 08:01:20.415414 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_397c7023-cd6a-42ac-8d37-5813f5f9d45e/cinder-api/0.log" Sep 30 08:01:20 crc kubenswrapper[4691]: I0930 08:01:20.605196 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0937a7d3-6bf0-4114-b73b-0d10f2f19945/cinder-scheduler/0.log" Sep 30 08:01:20 crc kubenswrapper[4691]: I0930 08:01:20.685821 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_0937a7d3-6bf0-4114-b73b-0d10f2f19945/probe/0.log" Sep 30 08:01:20 crc kubenswrapper[4691]: I0930 08:01:20.832441 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-528ll_6b78a233-7f96-48a0-b484-0bb1196d8d4e/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:20 crc kubenswrapper[4691]: I0930 08:01:20.901677 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-hpsvp_6bb5c646-a0b7-4ed5-b5ef-28727886b271/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:21 crc kubenswrapper[4691]: I0930 08:01:21.083374 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c6b9844bc-q6q6n_c18992fb-4c6e-4a18-a9b9-f00db9817b1b/init/0.log" Sep 30 08:01:21 crc kubenswrapper[4691]: I0930 08:01:21.274426 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-c6b9844bc-q6q6n_c18992fb-4c6e-4a18-a9b9-f00db9817b1b/init/0.log" Sep 30 08:01:21 crc kubenswrapper[4691]: I0930 08:01:21.464984 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_dnsmasq-dns-c6b9844bc-q6q6n_c18992fb-4c6e-4a18-a9b9-f00db9817b1b/dnsmasq-dns/0.log" Sep 30 08:01:21 crc kubenswrapper[4691]: I0930 08:01:21.508595 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-mwcfh_f8536c3f-e28e-49a1-9b22-bb6ab2652c5b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:21 crc kubenswrapper[4691]: I0930 08:01:21.690720 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d2a83f16-21dd-442b-b27d-6c583c783055/glance-log/0.log" Sep 30 08:01:21 crc kubenswrapper[4691]: I0930 08:01:21.691989 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_d2a83f16-21dd-442b-b27d-6c583c783055/glance-httpd/0.log" Sep 30 08:01:21 crc kubenswrapper[4691]: I0930 08:01:21.883520 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_34863af3-4c23-43ce-b483-713ca0d1f744/glance-log/0.log" Sep 30 08:01:21 crc kubenswrapper[4691]: I0930 08:01:21.890172 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_34863af3-4c23-43ce-b483-713ca0d1f744/glance-httpd/0.log" Sep 30 08:01:22 crc kubenswrapper[4691]: I0930 08:01:22.193550 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-crvv4_98946d0d-1b03-4bf2-bd9b-71105ac901f8/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:22 crc kubenswrapper[4691]: I0930 08:01:22.194483 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56948c48fd-czzmm_fec95e1e-14f4-4093-b1d4-402c29686348/horizon/0.log" Sep 30 08:01:22 crc kubenswrapper[4691]: I0930 08:01:22.492642 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-wlzdt_455e6d2b-cc2e-4b09-899d-f913094c603f/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:22 crc kubenswrapper[4691]: I0930 08:01:22.694850 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320261-kh7r8_a5b3e3e6-7e53-4c0f-a1a5-e77f887cb06b/keystone-cron/0.log" Sep 30 08:01:22 crc kubenswrapper[4691]: I0930 08:01:22.909344 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29320321-5bgjr_0511fcca-418a-452f-a2de-32c8edb206f2/keystone-cron/0.log" Sep 30 08:01:22 crc kubenswrapper[4691]: I0930 08:01:22.934838 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-56948c48fd-czzmm_fec95e1e-14f4-4093-b1d4-402c29686348/horizon-log/0.log" Sep 30 08:01:23 crc kubenswrapper[4691]: I0930 08:01:23.085737 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_36b81859-2533-442a-bf54-a2fe2a8a5baa/kube-state-metrics/0.log" Sep 30 08:01:23 crc kubenswrapper[4691]: I0930 08:01:23.163949 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-bf6754cd6-fsq4c_5fef5a53-6fb3-4c3b-8929-e9e49f85f050/keystone-api/0.log" Sep 30 08:01:23 crc kubenswrapper[4691]: I0930 08:01:23.353755 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-qgtz6_8e08d67e-28fd-4a4b-905a-765d0e33013d/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:23 crc kubenswrapper[4691]: I0930 08:01:23.821296 4691 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_neutron-8687477df-8l865_7d0f6749-bfde-4329-9905-f51ef18e904c/neutron-httpd/0.log" Sep 30 08:01:23 crc kubenswrapper[4691]: I0930 08:01:23.884898 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-8687477df-8l865_7d0f6749-bfde-4329-9905-f51ef18e904c/neutron-api/0.log" Sep 30 08:01:23 crc kubenswrapper[4691]: I0930 08:01:23.907110 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-qpl4r_214c8c4f-8184-4b59-9fcd-c1112551b5b2/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:24 crc kubenswrapper[4691]: I0930 08:01:24.777462 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_c523d401-a3b1-4181-8216-bbf80156c7c4/nova-cell0-conductor-conductor/0.log" Sep 30 08:01:25 crc kubenswrapper[4691]: I0930 08:01:25.409821 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b0efbbe5-44f2-4424-9a32-476f81246c28/nova-cell1-conductor-conductor/0.log" Sep 30 08:01:25 crc kubenswrapper[4691]: I0930 08:01:25.615637 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f8b5ef76-7c77-4698-9f7f-219791e59bd2/nova-api-log/0.log" Sep 30 08:01:26 crc kubenswrapper[4691]: I0930 08:01:26.055116 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_3a61f6fc-3212-4050-92f5-363ed195680f/nova-cell1-novncproxy-novncproxy/0.log" Sep 30 08:01:26 crc kubenswrapper[4691]: I0930 08:01:26.102800 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-z8bsh_c6e5786f-d234-454c-8276-5355726052be/nova-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:26 crc kubenswrapper[4691]: I0930 08:01:26.142265 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f8b5ef76-7c77-4698-9f7f-219791e59bd2/nova-api-api/0.log" Sep 30 08:01:26 crc kubenswrapper[4691]: I0930 08:01:26.431130 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4f6cce79-72b3-407a-8ac5-ca3782a878b5/nova-metadata-log/0.log" Sep 30 08:01:27 crc kubenswrapper[4691]: I0930 08:01:27.002595 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6/mysql-bootstrap/0.log" Sep 30 08:01:27 crc kubenswrapper[4691]: I0930 08:01:27.028405 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_24eb00d6-56e5-477b-840b-ad3f6fd6e473/nova-scheduler-scheduler/0.log" Sep 30 08:01:27 crc kubenswrapper[4691]: I0930 08:01:27.197514 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6/mysql-bootstrap/0.log" Sep 30 08:01:27 crc kubenswrapper[4691]: I0930 08:01:27.242850 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_605e7ef7-ef82-4a1f-98aa-15eef2a3f8c6/galera/0.log" Sep 30 08:01:27 crc kubenswrapper[4691]: I0930 08:01:27.496479 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_08782d24-2bd9-48d6-b9b2-12a2ad66e6d0/mysql-bootstrap/0.log" Sep 30 08:01:27 crc kubenswrapper[4691]: I0930 08:01:27.679251 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_08782d24-2bd9-48d6-b9b2-12a2ad66e6d0/mysql-bootstrap/0.log" Sep 30 
08:01:27 crc kubenswrapper[4691]: I0930 08:01:27.686529 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_08782d24-2bd9-48d6-b9b2-12a2ad66e6d0/galera/0.log" Sep 30 08:01:27 crc kubenswrapper[4691]: I0930 08:01:27.951393 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_7161fa44-37a9-4d36-b45b-b4f5cf9aa2ee/openstackclient/0.log" Sep 30 08:01:28 crc kubenswrapper[4691]: I0930 08:01:28.168734 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-2wmg8_86397f09-76d1-4c35-a96a-5b6bde1e3574/ovn-controller/0.log" Sep 30 08:01:28 crc kubenswrapper[4691]: I0930 08:01:28.336313 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8h9p5_24478def-6fea-4596-b4e3-fd3abee81a62/openstack-network-exporter/0.log" Sep 30 08:01:28 crc kubenswrapper[4691]: I0930 08:01:28.613751 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-csq87_99bbc7fe-4a99-4f60-b840-8843790d6cb4/ovsdb-server-init/0.log" Sep 30 08:01:28 crc kubenswrapper[4691]: I0930 08:01:28.649459 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_5c310640-e561-4e1e-8f7c-046a7eec139d/memcached/0.log" Sep 30 08:01:28 crc kubenswrapper[4691]: I0930 08:01:28.670827 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_4f6cce79-72b3-407a-8ac5-ca3782a878b5/nova-metadata-metadata/0.log" Sep 30 08:01:28 crc kubenswrapper[4691]: I0930 08:01:28.766209 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-csq87_99bbc7fe-4a99-4f60-b840-8843790d6cb4/ovsdb-server-init/0.log" Sep 30 08:01:28 crc kubenswrapper[4691]: I0930 08:01:28.841559 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-csq87_99bbc7fe-4a99-4f60-b840-8843790d6cb4/ovsdb-server/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.010954 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-csq87_99bbc7fe-4a99-4f60-b840-8843790d6cb4/ovs-vswitchd/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.011180 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-bl96m_8638fef8-f042-4cc1-949d-fb0c107085b5/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.083865 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e59b91c6-5922-4272-9c75-4e139031c87b/openstack-network-exporter/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.167777 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_e59b91c6-5922-4272-9c75-4e139031c87b/ovn-northd/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.225303 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:01:29 crc kubenswrapper[4691]: E0930 08:01:29.225653 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 
08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.259797 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_46482328-297b-40b1-83e1-2270733d27d7/openstack-network-exporter/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.302943 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_46482328-297b-40b1-83e1-2270733d27d7/ovsdbserver-nb/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.399353 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_48c486cf-48da-4fd0-b450-d821ab6b2755/openstack-network-exporter/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.437642 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_48c486cf-48da-4fd0-b450-d821ab6b2755/ovsdbserver-sb/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.694752 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56d945494d-7svb6_d35f539b-5139-4155-8f51-a1e425e19925/placement-api/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.730656 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5621e369-5e8a-491d-aa26-098025c50c2f/init-config-reloader/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.772057 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-56d945494d-7svb6_d35f539b-5139-4155-8f51-a1e425e19925/placement-log/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.888644 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5621e369-5e8a-491d-aa26-098025c50c2f/prometheus/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.894680 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5621e369-5e8a-491d-aa26-098025c50c2f/init-config-reloader/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.917636 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5621e369-5e8a-491d-aa26-098025c50c2f/config-reloader/0.log" Sep 30 08:01:29 crc kubenswrapper[4691]: I0930 08:01:29.977006 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_5621e369-5e8a-491d-aa26-098025c50c2f/thanos-sidecar/0.log" Sep 30 08:01:30 crc kubenswrapper[4691]: I0930 08:01:30.052525 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_136adcf8-2194-4dd2-9b57-6bf571f9e295/setup-container/0.log" Sep 30 08:01:30 crc kubenswrapper[4691]: I0930 08:01:30.316676 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_136adcf8-2194-4dd2-9b57-6bf571f9e295/rabbitmq/0.log" Sep 30 08:01:30 crc kubenswrapper[4691]: I0930 08:01:30.337433 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d454968e-74c7-45e3-9608-e915973c7f25/setup-container/0.log" Sep 30 08:01:30 crc kubenswrapper[4691]: I0930 08:01:30.542029 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_136adcf8-2194-4dd2-9b57-6bf571f9e295/setup-container/0.log" Sep 30 08:01:30 crc kubenswrapper[4691]: I0930 08:01:30.691688 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d454968e-74c7-45e3-9608-e915973c7f25/setup-container/0.log" Sep 30 08:01:30 crc kubenswrapper[4691]: I0930 08:01:30.709414 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5/setup-container/0.log" Sep 30 08:01:30 crc kubenswrapper[4691]: I0930 08:01:30.711797 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-notifications-server-0_d454968e-74c7-45e3-9608-e915973c7f25/rabbitmq/0.log" Sep 30 08:01:30 crc kubenswrapper[4691]: I0930 08:01:30.916060 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5/setup-container/0.log" Sep 30 08:01:30 crc kubenswrapper[4691]: I0930 08:01:30.935150 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_a7ac4031-4e4c-4dd1-bbe8-16846c31d7b5/rabbitmq/0.log" Sep 30 08:01:30 crc kubenswrapper[4691]: I0930 08:01:30.936086 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-cdxzg_6b407d88-19cd-402f-a417-64c08a37f051/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.091833 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-ljjt4_d39e1c92-309d-4295-8f78-e9d01ffdb114/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.160074 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-jkj8w_1ae5c682-dd33-42b2-8b7c-564876eef00a/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.297104 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-lt582_30e86152-90a5-42db-a157-e86cede48629/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.436980 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-jsfrl_890b2a56-9627-4b04-9e09-5bd7625272cd/ssh-known-hosts-edpm-deployment/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.652638 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-688b4ff469-2cgjc_e4e270ac-98e8-47b9-bf7b-7492996aa18c/proxy-server/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.657229 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-688b4ff469-2cgjc_e4e270ac-98e8-47b9-bf7b-7492996aa18c/proxy-httpd/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.702497 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-f4xvm_22775d02-1312-4d7a-917d-80dc62539dba/swift-ring-rebalance/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.816759 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/account-auditor/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.859964 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/account-reaper/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.939615 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/account-replicator/0.log" Sep 30 08:01:31 crc kubenswrapper[4691]: I0930 08:01:31.984126 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/account-server/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.041025 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/container-replicator/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.046952 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/container-auditor/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.122982 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/container-server/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.175574 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/container-updater/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.232966 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/object-auditor/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.239508 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/object-expirer/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.342512 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/object-replicator/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.362020 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/object-server/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.446830 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/rsync/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.449819 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/object-updater/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.512443 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_ccb4d975-40e7-4a38-8b86-b18e685c570b/swift-recon-cron/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.658339 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-cthz9_8c1bc2df-cff0-4d61-9773-0db30010956c/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.738227 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_d7f0691f-aa04-4bb3-b9aa-8e29fd3eeb03/tempest-tests-tempest-tests-runner/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.839416 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_e1d484c8-a1d8-4c39-89fb-4b7679e1c22a/test-operator-logs-container/0.log" Sep 30 08:01:32 crc kubenswrapper[4691]: I0930 08:01:32.918106 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-t25dz_e97ad218-7d51-462b-bdcf-cd39157152c1/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Sep 30 08:01:33 crc kubenswrapper[4691]: I0930 08:01:33.886132 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-applier-0_901f3032-8727-419d-8de7-b00c08535ca1/watcher-applier/0.log" Sep 30 08:01:34 crc kubenswrapper[4691]: I0930 08:01:34.117996 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_af3f1644-3ab8-4a6a-9f80-f8ea42297e98/watcher-decision-engine/1.log" Sep 30 08:01:34 crc kubenswrapper[4691]: I0930 08:01:34.319275 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_7c37a536-f38a-431d-8b76-fa23d610af0b/watcher-api-log/0.log" Sep 30 08:01:36 crc kubenswrapper[4691]: I0930 08:01:36.736931 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-decision-engine-0_af3f1644-3ab8-4a6a-9f80-f8ea42297e98/watcher-decision-engine/2.log" Sep 30 08:01:37 crc kubenswrapper[4691]: I0930 08:01:37.806545 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_watcher-api-0_7c37a536-f38a-431d-8b76-fa23d610af0b/watcher-api/0.log" Sep 30 08:01:43 crc kubenswrapper[4691]: I0930 08:01:43.225101 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:01:43 crc kubenswrapper[4691]: E0930 08:01:43.226080 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:01:56 crc kubenswrapper[4691]: I0930 08:01:56.224540 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:01:56 crc kubenswrapper[4691]: E0930 08:01:56.225386 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:02:03 crc kubenswrapper[4691]: I0930 08:02:03.263072 4691 generic.go:334] "Generic (PLEG): container finished" podID="c3809769-1691-4da5-8006-42a670facdb2" containerID="23cd875c3665508cb3172658942abea599d7610c29c55827096cc831a5c32322" exitCode=0 Sep 30 08:02:03 crc kubenswrapper[4691]: I0930 08:02:03.263290 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/crc-debug-hktr8" event={"ID":"c3809769-1691-4da5-8006-42a670facdb2","Type":"ContainerDied","Data":"23cd875c3665508cb3172658942abea599d7610c29c55827096cc831a5c32322"} Sep 30 08:02:04 crc kubenswrapper[4691]: I0930 08:02:04.409783 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-hktr8" Sep 30 08:02:04 crc kubenswrapper[4691]: I0930 08:02:04.460329 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hhnpm/crc-debug-hktr8"] Sep 30 08:02:04 crc kubenswrapper[4691]: I0930 08:02:04.471379 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hhnpm/crc-debug-hktr8"] Sep 30 08:02:04 crc kubenswrapper[4691]: I0930 08:02:04.559912 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8jk4\" (UniqueName: \"kubernetes.io/projected/c3809769-1691-4da5-8006-42a670facdb2-kube-api-access-w8jk4\") pod \"c3809769-1691-4da5-8006-42a670facdb2\" (UID: \"c3809769-1691-4da5-8006-42a670facdb2\") " Sep 30 08:02:04 crc kubenswrapper[4691]: I0930 08:02:04.560086 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3809769-1691-4da5-8006-42a670facdb2-host\") pod \"c3809769-1691-4da5-8006-42a670facdb2\" (UID: \"c3809769-1691-4da5-8006-42a670facdb2\") " Sep 30 08:02:04 crc kubenswrapper[4691]: I0930 08:02:04.560259 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3809769-1691-4da5-8006-42a670facdb2-host" (OuterVolumeSpecName: "host") pod "c3809769-1691-4da5-8006-42a670facdb2" (UID: "c3809769-1691-4da5-8006-42a670facdb2"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 08:02:04 crc kubenswrapper[4691]: I0930 08:02:04.560795 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/c3809769-1691-4da5-8006-42a670facdb2-host\") on node \"crc\" DevicePath \"\"" Sep 30 08:02:04 crc kubenswrapper[4691]: I0930 08:02:04.568768 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3809769-1691-4da5-8006-42a670facdb2-kube-api-access-w8jk4" (OuterVolumeSpecName: "kube-api-access-w8jk4") pod "c3809769-1691-4da5-8006-42a670facdb2" (UID: "c3809769-1691-4da5-8006-42a670facdb2"). InnerVolumeSpecName "kube-api-access-w8jk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:02:04 crc kubenswrapper[4691]: I0930 08:02:04.664544 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8jk4\" (UniqueName: \"kubernetes.io/projected/c3809769-1691-4da5-8006-42a670facdb2-kube-api-access-w8jk4\") on node \"crc\" DevicePath \"\"" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.243986 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3809769-1691-4da5-8006-42a670facdb2" path="/var/lib/kubelet/pods/c3809769-1691-4da5-8006-42a670facdb2/volumes" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.293202 4691 scope.go:117] "RemoveContainer" containerID="23cd875c3665508cb3172658942abea599d7610c29c55827096cc831a5c32322" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.293323 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-hktr8" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.669254 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hhnpm/crc-debug-mfd99"] Sep 30 08:02:05 crc kubenswrapper[4691]: E0930 08:02:05.669915 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3809769-1691-4da5-8006-42a670facdb2" containerName="container-00" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.669939 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3809769-1691-4da5-8006-42a670facdb2" containerName="container-00" Sep 30 08:02:05 crc kubenswrapper[4691]: E0930 08:02:05.669988 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0511fcca-418a-452f-a2de-32c8edb206f2" containerName="keystone-cron" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.670001 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="0511fcca-418a-452f-a2de-32c8edb206f2" containerName="keystone-cron" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.670364 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3809769-1691-4da5-8006-42a670facdb2" containerName="container-00" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.670403 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="0511fcca-418a-452f-a2de-32c8edb206f2" containerName="keystone-cron" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.671466 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-mfd99" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.674513 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hhnpm"/"default-dockercfg-zn5xk" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.788502 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htkt6\" (UniqueName: \"kubernetes.io/projected/cc6f112c-f283-4a90-a889-d012e7163af9-kube-api-access-htkt6\") pod \"crc-debug-mfd99\" (UID: \"cc6f112c-f283-4a90-a889-d012e7163af9\") " pod="openshift-must-gather-hhnpm/crc-debug-mfd99" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.789226 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc6f112c-f283-4a90-a889-d012e7163af9-host\") pod \"crc-debug-mfd99\" (UID: \"cc6f112c-f283-4a90-a889-d012e7163af9\") " pod="openshift-must-gather-hhnpm/crc-debug-mfd99" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.892645 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc6f112c-f283-4a90-a889-d012e7163af9-host\") pod \"crc-debug-mfd99\" (UID: \"cc6f112c-f283-4a90-a889-d012e7163af9\") " pod="openshift-must-gather-hhnpm/crc-debug-mfd99" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.892869 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htkt6\" (UniqueName: \"kubernetes.io/projected/cc6f112c-f283-4a90-a889-d012e7163af9-kube-api-access-htkt6\") pod \"crc-debug-mfd99\" (UID: \"cc6f112c-f283-4a90-a889-d012e7163af9\") " pod="openshift-must-gather-hhnpm/crc-debug-mfd99" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.893375 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/cc6f112c-f283-4a90-a889-d012e7163af9-host\") pod \"crc-debug-mfd99\" (UID: \"cc6f112c-f283-4a90-a889-d012e7163af9\") " pod="openshift-must-gather-hhnpm/crc-debug-mfd99" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.926820 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htkt6\" (UniqueName: \"kubernetes.io/projected/cc6f112c-f283-4a90-a889-d012e7163af9-kube-api-access-htkt6\") pod \"crc-debug-mfd99\" (UID: \"cc6f112c-f283-4a90-a889-d012e7163af9\") " pod="openshift-must-gather-hhnpm/crc-debug-mfd99" Sep 30 08:02:05 crc kubenswrapper[4691]: I0930 08:02:05.999925 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-mfd99" Sep 30 08:02:06 crc kubenswrapper[4691]: I0930 08:02:06.319512 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/crc-debug-mfd99" event={"ID":"cc6f112c-f283-4a90-a889-d012e7163af9","Type":"ContainerStarted","Data":"8387b5c376e7bdfeb4a8e1fecc40631a52f0b949ecae68069dc8c281d95e9edc"} Sep 30 08:02:07 crc kubenswrapper[4691]: I0930 08:02:07.356733 4691 generic.go:334] "Generic (PLEG): container finished" podID="cc6f112c-f283-4a90-a889-d012e7163af9" containerID="45ba7d826f87e857f104111ddb035e209849e73cb197877340f92b4c745fa0cc" exitCode=0 Sep 30 08:02:07 crc kubenswrapper[4691]: I0930 08:02:07.356821 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/crc-debug-mfd99" event={"ID":"cc6f112c-f283-4a90-a889-d012e7163af9","Type":"ContainerDied","Data":"45ba7d826f87e857f104111ddb035e209849e73cb197877340f92b4c745fa0cc"} Sep 30 08:02:08 crc kubenswrapper[4691]: I0930 08:02:08.467604 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-mfd99" Sep 30 08:02:08 crc kubenswrapper[4691]: I0930 08:02:08.649195 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htkt6\" (UniqueName: \"kubernetes.io/projected/cc6f112c-f283-4a90-a889-d012e7163af9-kube-api-access-htkt6\") pod \"cc6f112c-f283-4a90-a889-d012e7163af9\" (UID: \"cc6f112c-f283-4a90-a889-d012e7163af9\") " Sep 30 08:02:08 crc kubenswrapper[4691]: I0930 08:02:08.649299 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc6f112c-f283-4a90-a889-d012e7163af9-host\") pod \"cc6f112c-f283-4a90-a889-d012e7163af9\" (UID: \"cc6f112c-f283-4a90-a889-d012e7163af9\") " Sep 30 08:02:08 crc kubenswrapper[4691]: I0930 08:02:08.649362 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cc6f112c-f283-4a90-a889-d012e7163af9-host" (OuterVolumeSpecName: "host") pod "cc6f112c-f283-4a90-a889-d012e7163af9" (UID: "cc6f112c-f283-4a90-a889-d012e7163af9"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 08:02:08 crc kubenswrapper[4691]: I0930 08:02:08.649700 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cc6f112c-f283-4a90-a889-d012e7163af9-host\") on node \"crc\" DevicePath \"\"" Sep 30 08:02:08 crc kubenswrapper[4691]: I0930 08:02:08.654303 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc6f112c-f283-4a90-a889-d012e7163af9-kube-api-access-htkt6" (OuterVolumeSpecName: "kube-api-access-htkt6") pod "cc6f112c-f283-4a90-a889-d012e7163af9" (UID: "cc6f112c-f283-4a90-a889-d012e7163af9"). InnerVolumeSpecName "kube-api-access-htkt6". PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:02:08 crc kubenswrapper[4691]: I0930 08:02:08.751014 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htkt6\" (UniqueName: \"kubernetes.io/projected/cc6f112c-f283-4a90-a889-d012e7163af9-kube-api-access-htkt6\") on node \"crc\" DevicePath \"\"" Sep 30 08:02:09 crc kubenswrapper[4691]: I0930 08:02:09.374824 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/crc-debug-mfd99" event={"ID":"cc6f112c-f283-4a90-a889-d012e7163af9","Type":"ContainerDied","Data":"8387b5c376e7bdfeb4a8e1fecc40631a52f0b949ecae68069dc8c281d95e9edc"} Sep 30 08:02:09 crc kubenswrapper[4691]: I0930 08:02:09.374872 4691 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8387b5c376e7bdfeb4a8e1fecc40631a52f0b949ecae68069dc8c281d95e9edc" Sep 30 08:02:09 crc kubenswrapper[4691]: I0930 08:02:09.374913 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-mfd99" Sep 30 08:02:10 crc kubenswrapper[4691]: I0930 08:02:10.224472 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:02:10 crc kubenswrapper[4691]: E0930 08:02:10.224855 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:02:17 crc kubenswrapper[4691]: I0930 08:02:17.441641 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hhnpm/crc-debug-mfd99"] Sep 30 08:02:17 crc kubenswrapper[4691]: I0930 08:02:17.449922 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hhnpm/crc-debug-mfd99"] Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.648611 4691 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-hhnpm/crc-debug-bdnms"] Sep 30 08:02:18 crc kubenswrapper[4691]: E0930 08:02:18.649749 4691 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc6f112c-f283-4a90-a889-d012e7163af9" containerName="container-00" Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.649772 4691 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc6f112c-f283-4a90-a889-d012e7163af9" containerName="container-00" Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.650195 4691 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc6f112c-f283-4a90-a889-d012e7163af9" containerName="container-00" Sep 30 
08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.651607 4691 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-bdnms" Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.654073 4691 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-hhnpm"/"default-dockercfg-zn5xk" Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.818934 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e427378-6831-4927-9b7e-f920c764ec01-host\") pod \"crc-debug-bdnms\" (UID: \"5e427378-6831-4927-9b7e-f920c764ec01\") " pod="openshift-must-gather-hhnpm/crc-debug-bdnms" Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.819402 4691 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhww2\" (UniqueName: \"kubernetes.io/projected/5e427378-6831-4927-9b7e-f920c764ec01-kube-api-access-mhww2\") pod \"crc-debug-bdnms\" (UID: \"5e427378-6831-4927-9b7e-f920c764ec01\") " pod="openshift-must-gather-hhnpm/crc-debug-bdnms" Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.921959 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhww2\" (UniqueName: \"kubernetes.io/projected/5e427378-6831-4927-9b7e-f920c764ec01-kube-api-access-mhww2\") pod \"crc-debug-bdnms\" (UID: \"5e427378-6831-4927-9b7e-f920c764ec01\") " pod="openshift-must-gather-hhnpm/crc-debug-bdnms" Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.922075 4691 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e427378-6831-4927-9b7e-f920c764ec01-host\") pod \"crc-debug-bdnms\" (UID: \"5e427378-6831-4927-9b7e-f920c764ec01\") " pod="openshift-must-gather-hhnpm/crc-debug-bdnms" Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.922314 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e427378-6831-4927-9b7e-f920c764ec01-host\") pod \"crc-debug-bdnms\" (UID: \"5e427378-6831-4927-9b7e-f920c764ec01\") " pod="openshift-must-gather-hhnpm/crc-debug-bdnms" Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.944411 4691 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhww2\" (UniqueName: \"kubernetes.io/projected/5e427378-6831-4927-9b7e-f920c764ec01-kube-api-access-mhww2\") pod \"crc-debug-bdnms\" (UID: \"5e427378-6831-4927-9b7e-f920c764ec01\") " pod="openshift-must-gather-hhnpm/crc-debug-bdnms" Sep 30 08:02:18 crc kubenswrapper[4691]: I0930 08:02:18.986253 4691 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-bdnms" Sep 30 08:02:19 crc kubenswrapper[4691]: I0930 08:02:19.240467 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc6f112c-f283-4a90-a889-d012e7163af9" path="/var/lib/kubelet/pods/cc6f112c-f283-4a90-a889-d012e7163af9/volumes" Sep 30 08:02:19 crc kubenswrapper[4691]: I0930 08:02:19.456057 4691 generic.go:334] "Generic (PLEG): container finished" podID="5e427378-6831-4927-9b7e-f920c764ec01" containerID="33fbced17dc0f1659acdf7240766597a0027ce3b48c4c166f689abcda93c24e3" exitCode=0 Sep 30 08:02:19 crc kubenswrapper[4691]: I0930 08:02:19.456154 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/crc-debug-bdnms" event={"ID":"5e427378-6831-4927-9b7e-f920c764ec01","Type":"ContainerDied","Data":"33fbced17dc0f1659acdf7240766597a0027ce3b48c4c166f689abcda93c24e3"} Sep 30 08:02:19 crc kubenswrapper[4691]: I0930 08:02:19.456418 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/crc-debug-bdnms" event={"ID":"5e427378-6831-4927-9b7e-f920c764ec01","Type":"ContainerStarted","Data":"9dddb7ac5e4a54f4e23aa563de4aacffcc011ac274015c536a548214236d9204"} Sep 30 08:02:19 crc kubenswrapper[4691]: I0930 08:02:19.493372 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hhnpm/crc-debug-bdnms"] Sep 30 08:02:19 crc kubenswrapper[4691]: I0930 08:02:19.500454 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hhnpm/crc-debug-bdnms"] Sep 30 08:02:20 crc kubenswrapper[4691]: I0930 08:02:20.578424 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-bdnms" Sep 30 08:02:20 crc kubenswrapper[4691]: I0930 08:02:20.676710 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhww2\" (UniqueName: \"kubernetes.io/projected/5e427378-6831-4927-9b7e-f920c764ec01-kube-api-access-mhww2\") pod \"5e427378-6831-4927-9b7e-f920c764ec01\" (UID: \"5e427378-6831-4927-9b7e-f920c764ec01\") " Sep 30 08:02:20 crc kubenswrapper[4691]: I0930 08:02:20.676906 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e427378-6831-4927-9b7e-f920c764ec01-host\") pod \"5e427378-6831-4927-9b7e-f920c764ec01\" (UID: \"5e427378-6831-4927-9b7e-f920c764ec01\") " Sep 30 08:02:20 crc kubenswrapper[4691]: I0930 08:02:20.677000 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5e427378-6831-4927-9b7e-f920c764ec01-host" (OuterVolumeSpecName: "host") pod "5e427378-6831-4927-9b7e-f920c764ec01" (UID: "5e427378-6831-4927-9b7e-f920c764ec01"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Sep 30 08:02:20 crc kubenswrapper[4691]: I0930 08:02:20.677314 4691 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5e427378-6831-4927-9b7e-f920c764ec01-host\") on node \"crc\" DevicePath \"\"" Sep 30 08:02:20 crc kubenswrapper[4691]: I0930 08:02:20.681214 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e427378-6831-4927-9b7e-f920c764ec01-kube-api-access-mhww2" (OuterVolumeSpecName: "kube-api-access-mhww2") pod "5e427378-6831-4927-9b7e-f920c764ec01" (UID: "5e427378-6831-4927-9b7e-f920c764ec01"). InnerVolumeSpecName "kube-api-access-mhww2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:02:20 crc kubenswrapper[4691]: I0930 08:02:20.779760 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhww2\" (UniqueName: \"kubernetes.io/projected/5e427378-6831-4927-9b7e-f920c764ec01-kube-api-access-mhww2\") on node \"crc\" DevicePath \"\"" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.082692 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/util/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.242715 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e427378-6831-4927-9b7e-f920c764ec01" path="/var/lib/kubelet/pods/5e427378-6831-4927-9b7e-f920c764ec01/volumes" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.254240 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/util/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.257574 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/pull/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.285400 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/pull/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.431743 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/util/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.454743 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/extract/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.459515 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3cdb0998117b6528e8d4f4afc7f870da2241657f1d88a1ece591ac0268b75jz_ed991450-2c1e-4d1a-a54c-196c5067ce69/pull/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.474792 4691 scope.go:117] "RemoveContainer" containerID="33fbced17dc0f1659acdf7240766597a0027ce3b48c4c166f689abcda93c24e3" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.474854 4691 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-hhnpm/crc-debug-bdnms" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.588427 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-qqx55_a5779e0d-8902-4a45-b28e-4253af3938ae/kube-rbac-proxy/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.675398 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-6ff8b75857-qqx55_a5779e0d-8902-4a45-b28e-4253af3938ae/manager/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.675899 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-9dj86_d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2/kube-rbac-proxy/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.815625 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-644bddb6d8-9dj86_d80f2f0e-5d71-42d4-8bb3-69b8dadd63c2/manager/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.874053 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-8wf6n_2a1af285-1505-419c-bacc-16d8a161aca2/kube-rbac-proxy/0.log" Sep 30 08:02:21 crc kubenswrapper[4691]: I0930 08:02:21.913946 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-84f4f7b77b-8wf6n_2a1af285-1505-419c-bacc-16d8a161aca2/manager/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.018764 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-qwmv9_60dcfaf5-c692-44e5-8868-1dfccb14f535/kube-rbac-proxy/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.118926 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-84958c4d49-qwmv9_60dcfaf5-c692-44e5-8868-1dfccb14f535/manager/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.137594 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-5j6fw_5c0ba848-ac6e-4515-99e3-e1665ff79d7c/kube-rbac-proxy/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.218044 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5d889d78cf-5j6fw_5c0ba848-ac6e-4515-99e3-e1665ff79d7c/manager/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.296530 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-rxdj9_c9f2f281-c656-4c29-bf86-c38f9cd79528/kube-rbac-proxy/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.309911 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-9f4696d94-rxdj9_c9f2f281-c656-4c29-bf86-c38f9cd79528/manager/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.485243 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-jgh2d_1d4f2da8-966a-4a80-aca6-efdd8faca337/kube-rbac-proxy/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.608260 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-x79vm_9c5c1b63-6185-424c-a584-35a18e2c69bd/kube-rbac-proxy/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.683229 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7d857cc749-jgh2d_1d4f2da8-966a-4a80-aca6-efdd8faca337/manager/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.709135 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-7975b88857-x79vm_9c5c1b63-6185-424c-a584-35a18e2c69bd/manager/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.837062 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-9xs5h_d6fb63f5-e7b6-47fd-ac44-b59058899b3c/kube-rbac-proxy/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.881722 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-5bd55b4bff-9xs5h_d6fb63f5-e7b6-47fd-ac44-b59058899b3c/manager/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.899913 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-rtqwx_11bd74e6-05a4-44fc-b360-f1d71352011e/kube-rbac-proxy/0.log" Sep 30 08:02:22 crc kubenswrapper[4691]: I0930 08:02:22.998939 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-6d68dbc695-rtqwx_11bd74e6-05a4-44fc-b360-f1d71352011e/manager/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.082315 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-c5nbk_a1aaa7fa-8695-4124-ad5a-26f11a99b1c8/kube-rbac-proxy/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.164389 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-88c7-c5nbk_a1aaa7fa-8695-4124-ad5a-26f11a99b1c8/manager/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.265212 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-2kzwg_2c674607-65d9-4be2-9244-d61eadb97dd7/kube-rbac-proxy/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.284566 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-64d7b59854-2kzwg_2c674607-65d9-4be2-9244-d61eadb97dd7/manager/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.405654 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-gtznz_a8dd4aa3-ab8b-4f66-9722-8873600c87eb/kube-rbac-proxy/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.523481 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-c7c776c96-gtznz_a8dd4aa3-ab8b-4f66-9722-8873600c87eb/manager/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.583154 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-h42cw_88c75f60-538f-4059-aaeb-b41dcdcf7cfa/manager/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.586003 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-76fcc6dc7c-h42cw_88c75f60-538f-4059-aaeb-b41dcdcf7cfa/kube-rbac-proxy/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.692740 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-r8s99_4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d/kube-rbac-proxy/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.757463 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6d776955-r8s99_4a38ca7d-d01d-4a0e-adfa-4875c46d8d7d/manager/0.log" Sep 30 08:02:23 crc kubenswrapper[4691]: I0930 08:02:23.877789 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-86d6bdfc6d-7zkhq_1013022f-3fa2-44d5-a111-5f89a6a7bb17/kube-rbac-proxy/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.024670 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6bb46fb86b-vx6kv_e5d557a0-1dee-462b-89d5-86c8479ef2e4/kube-rbac-proxy/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.186941 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-6bb46fb86b-vx6kv_e5d557a0-1dee-462b-89d5-86c8479ef2e4/operator/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.255959 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-swhwg_664688ee-c3cc-4f92-86b7-64d53b8c133d/registry-server/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.327444 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-bzx6l_a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af/kube-rbac-proxy/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.468664 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-9976ff44c-bzx6l_a7851de5-00e1-43bb-b0c9-8ae4f4f9c5af/manager/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.485854 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-vd4rw_52aa93bd-f5d7-479e-a8fe-2c6e70a70fae/kube-rbac-proxy/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.555401 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-589c58c6c-vd4rw_52aa93bd-f5d7-479e-a8fe-2c6e70a70fae/manager/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.786672 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-79d8469568-7v82c_54fbcf55-e81d-4336-8e38-9bb1d3ec3c47/operator/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.791993 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-s4rpv_3bce910d-be3e-4332-89df-75e715d95988/kube-rbac-proxy/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.878159 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-bc7dc7bd9-s4rpv_3bce910d-be3e-4332-89df-75e715d95988/manager/0.log" Sep 30 08:02:24 crc kubenswrapper[4691]: I0930 08:02:24.978858 4691 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-44qv5_88da73a4-c9e2-4a78-b313-8cf689562e38/kube-rbac-proxy/0.log" Sep 30 08:02:25 crc kubenswrapper[4691]: I0930 08:02:25.153052 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-8qj9q_a1cbd98a-2f66-4649-8347-938d07f93eb1/kube-rbac-proxy/0.log" Sep 30 08:02:25 crc kubenswrapper[4691]: I0930 08:02:25.224564 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:02:25 crc kubenswrapper[4691]: E0930 08:02:25.224914 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:02:25 crc kubenswrapper[4691]: I0930 08:02:25.231221 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-f66b554c6-8qj9q_a1cbd98a-2f66-4649-8347-938d07f93eb1/manager/0.log" Sep 30 08:02:25 crc kubenswrapper[4691]: I0930 08:02:25.316713 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-86d6bdfc6d-7zkhq_1013022f-3fa2-44d5-a111-5f89a6a7bb17/manager/0.log" Sep 30 08:02:25 crc kubenswrapper[4691]: I0930 08:02:25.335156 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bd494bc6d-x495w_9678f82b-58e6-4529-bdf6-6faaf2d7bcfa/kube-rbac-proxy/0.log" Sep 30 08:02:25 crc kubenswrapper[4691]: I0930 08:02:25.342438 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-b8d54b5d7-44qv5_88da73a4-c9e2-4a78-b313-8cf689562e38/manager/0.log" Sep 30 08:02:25 crc kubenswrapper[4691]: I0930 08:02:25.510055 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-bd494bc6d-x495w_9678f82b-58e6-4529-bdf6-6faaf2d7bcfa/manager/0.log" Sep 30 08:02:28 crc kubenswrapper[4691]: I0930 08:02:28.269289 4691 scope.go:117] "RemoveContainer" containerID="458226a15c404f1418d17b80690fde5b54a44837964707213b53d584f857649f" Sep 30 08:02:37 crc kubenswrapper[4691]: I0930 08:02:37.235118 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:02:37 crc kubenswrapper[4691]: E0930 08:02:37.235859 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:02:41 crc kubenswrapper[4691]: I0930 08:02:41.291507 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-kr8lp_2997210f-48b1-46e1-bf0f-12ed24852c8b/control-plane-machine-set-operator/0.log" Sep 30 08:02:41 crc 
kubenswrapper[4691]: I0930 08:02:41.358700 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kzftt_ee1c2dd6-d759-4d3c-9ec7-86ec11419202/kube-rbac-proxy/0.log" Sep 30 08:02:41 crc kubenswrapper[4691]: I0930 08:02:41.476164 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-kzftt_ee1c2dd6-d759-4d3c-9ec7-86ec11419202/machine-api-operator/0.log" Sep 30 08:02:51 crc kubenswrapper[4691]: I0930 08:02:51.225171 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:02:51 crc kubenswrapper[4691]: E0930 08:02:51.225815 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:02:54 crc kubenswrapper[4691]: I0930 08:02:54.239919 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-5b446d88c5-xxwh7_2ee3b0f5-be03-426e-b603-ec6c53237e85/cert-manager-controller/0.log" Sep 30 08:02:54 crc kubenswrapper[4691]: I0930 08:02:54.385075 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-7f985d654d-xwxbt_5746a924-b059-4e93-91c3-31bbe5e2ef86/cert-manager-cainjector/0.log" Sep 30 08:02:54 crc kubenswrapper[4691]: I0930 08:02:54.409655 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-5655c58dd6-99v8p_746ab0d5-3b5c-4985-935e-73a35939302d/cert-manager-webhook/0.log" Sep 30 08:03:06 crc kubenswrapper[4691]: I0930 08:03:06.226085 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:03:06 crc kubenswrapper[4691]: E0930 08:03:06.226828 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:03:07 crc kubenswrapper[4691]: I0930 08:03:07.395446 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-864bb6dfb5-r4qfl_e1989084-5e13-4ce8-9d59-050337ff70da/nmstate-console-plugin/0.log" Sep 30 08:03:07 crc kubenswrapper[4691]: I0930 08:03:07.599098 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-gcxcd_3df27da9-f98e-41a7-84fb-bfad238e7533/nmstate-handler/0.log" Sep 30 08:03:07 crc kubenswrapper[4691]: I0930 08:03:07.634741 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tqjqg_55ec66af-837d-40c5-81d2-6b311f0dc05c/kube-rbac-proxy/0.log" Sep 30 08:03:07 crc kubenswrapper[4691]: I0930 08:03:07.634827 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-58fcddf996-tqjqg_55ec66af-837d-40c5-81d2-6b311f0dc05c/nmstate-metrics/0.log" Sep 30 08:03:07 crc kubenswrapper[4691]: I0930 
08:03:07.815976 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5d6f6cfd66-wnr29_8dee2c6d-f8b8-4b1a-ae65-af2728adad3e/nmstate-operator/0.log" Sep 30 08:03:07 crc kubenswrapper[4691]: I0930 08:03:07.867111 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-6d689559c5-pz8zd_3cdd6ae9-7044-4fb4-92fb-0a503651b60d/nmstate-webhook/0.log" Sep 30 08:03:19 crc kubenswrapper[4691]: I0930 08:03:19.225401 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:03:19 crc kubenswrapper[4691]: E0930 08:03:19.226269 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:03:21 crc kubenswrapper[4691]: I0930 08:03:21.924543 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-52gd8_238ed092-dc40-4e7d-add0-854dd611a65f/kube-rbac-proxy/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.140726 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-frr-files/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.144867 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-5d688f5ffc-52gd8_238ed092-dc40-4e7d-add0-854dd611a65f/controller/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.346779 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-metrics/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.346819 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-reloader/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.361031 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-frr-files/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.362983 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-reloader/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.498837 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-frr-files/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.516696 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-reloader/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.546627 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-metrics/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.553896 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-metrics/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.731438 4691 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/controller/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.734355 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-frr-files/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.747678 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-metrics/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.757286 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/cp-reloader/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.940587 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/frr-metrics/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.951667 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/kube-rbac-proxy/0.log" Sep 30 08:03:22 crc kubenswrapper[4691]: I0930 08:03:22.965063 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/kube-rbac-proxy-frr/0.log" Sep 30 08:03:23 crc kubenswrapper[4691]: I0930 08:03:23.162851 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/reloader/0.log" Sep 30 08:03:23 crc kubenswrapper[4691]: I0930 08:03:23.240987 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-5478bdb765-4szpr_45188dc2-6524-4d87-bfaa-676d46684df8/frr-k8s-webhook-server/0.log" Sep 30 08:03:23 crc kubenswrapper[4691]: I0930 08:03:23.435942 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-5b7f74d8d8-lfcgt_fc264033-2e29-41cc-b961-92dbd3230d34/manager/0.log" Sep 30 08:03:23 crc kubenswrapper[4691]: I0930 08:03:23.631877 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-b8f956b88-zp5fz_164027cf-f7af-41cc-bbd2-e3a725230c9e/webhook-server/0.log" Sep 30 08:03:23 crc kubenswrapper[4691]: I0930 08:03:23.715911 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vqw66_cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6/kube-rbac-proxy/0.log" Sep 30 08:03:24 crc kubenswrapper[4691]: I0930 08:03:24.309075 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-vqw66_cbbc08d3-615a-450a-9a3e-7f0aba1c5ff6/speaker/0.log" Sep 30 08:03:24 crc kubenswrapper[4691]: I0930 08:03:24.621556 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jbzhs_9949a206-ebbd-42f2-8b22-8dfcf266b934/frr/0.log" Sep 30 08:03:31 crc kubenswrapper[4691]: I0930 08:03:31.228672 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:03:31 crc kubenswrapper[4691]: E0930 08:03:31.230442 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:03:37 crc kubenswrapper[4691]: I0930 08:03:37.629485 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/util/0.log" Sep 30 08:03:37 crc kubenswrapper[4691]: I0930 08:03:37.819764 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/pull/0.log" Sep 30 08:03:37 crc kubenswrapper[4691]: I0930 08:03:37.841091 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/util/0.log" Sep 30 08:03:37 crc kubenswrapper[4691]: I0930 08:03:37.856274 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/pull/0.log" Sep 30 08:03:38 crc kubenswrapper[4691]: I0930 08:03:38.007084 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/extract/0.log" Sep 30 08:03:38 crc kubenswrapper[4691]: I0930 08:03:38.055529 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/util/0.log" Sep 30 08:03:38 crc kubenswrapper[4691]: I0930 08:03:38.064656 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_9a6e092ce660f08e14c0b0ceab3711fa43f2b70244f9df8a7a069040bc6pcbs_36dddebb-8230-4914-b81c-b53683028a63/pull/0.log" Sep 30 08:03:38 crc kubenswrapper[4691]: I0930 08:03:38.187874 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/util/0.log" Sep 30 08:03:38 crc kubenswrapper[4691]: I0930 08:03:38.439009 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/pull/0.log" Sep 30 08:03:38 crc kubenswrapper[4691]: I0930 08:03:38.452187 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/util/0.log" Sep 30 08:03:38 crc kubenswrapper[4691]: I0930 08:03:38.454672 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/pull/0.log" Sep 30 08:03:38 crc kubenswrapper[4691]: I0930 08:03:38.602186 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/util/0.log" Sep 30 08:03:38 crc kubenswrapper[4691]: I0930 08:03:38.627292 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/pull/0.log" Sep 30 08:03:38 crc 
kubenswrapper[4691]: I0930 08:03:38.631458 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_a6d815214afcb93f379916e45350d3de39072121f31a1d7eaaf6e22c2dkrbqr_de284bf7-ec7a-419c-89fb-8a555cd5b320/extract/0.log" Sep 30 08:03:38 crc kubenswrapper[4691]: I0930 08:03:38.801915 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-utilities/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.145861 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-utilities/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.171521 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-content/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.241735 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-content/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.296646 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-utilities/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.303809 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/extract-content/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.463410 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-utilities/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.693460 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-9xmqg_93c048ec-3ef2-4dc9-af06-5f3aa9bcae6f/registry-server/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.776092 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-content/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.793511 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-utilities/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.796809 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-content/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.954437 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-utilities/0.log" Sep 30 08:03:39 crc kubenswrapper[4691]: I0930 08:03:39.968430 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/extract-content/0.log" Sep 30 08:03:40 crc kubenswrapper[4691]: I0930 08:03:40.171603 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/util/0.log" Sep 30 08:03:40 crc kubenswrapper[4691]: I0930 08:03:40.465210 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/util/0.log" Sep 30 08:03:40 crc kubenswrapper[4691]: I0930 08:03:40.494398 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/pull/0.log" Sep 30 08:03:40 crc kubenswrapper[4691]: I0930 08:03:40.514361 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/pull/0.log" Sep 30 08:03:40 crc kubenswrapper[4691]: I0930 08:03:40.706419 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/pull/0.log" Sep 30 08:03:40 crc kubenswrapper[4691]: I0930 08:03:40.732040 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/util/0.log" Sep 30 08:03:40 crc kubenswrapper[4691]: I0930 08:03:40.748011 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_f29efc416ca216184f30dbb4b19e0f463bdcecc8ef634322abbad88d96lp95h_cfeab7bf-5c68-43a0-8c09-cbf293e56f35/extract/0.log" Sep 30 08:03:40 crc kubenswrapper[4691]: I0930 08:03:40.937035 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-4km9n_f46c875b-2f18-4fac-98af-64b0756b7e26/marketplace-operator/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.010426 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-hcqm7_89aa6b93-31ae-4f4f-ad2c-bed85c8fd8fa/registry-server/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.143114 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-utilities/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.321374 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-content/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.326568 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-utilities/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.334339 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-content/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.509392 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-utilities/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.591942 4691 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-zjndd_fcb4d569-d722-4e61-a4eb-e05e8bfe3927/extract-utilities/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.614719 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/extract-content/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.720592 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-tzs7x_02f1c945-9a65-40b7-871c-aebadb76aa48/registry-server/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.776996 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zjndd_fcb4d569-d722-4e61-a4eb-e05e8bfe3927/extract-content/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.800691 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zjndd_fcb4d569-d722-4e61-a4eb-e05e8bfe3927/extract-utilities/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.833966 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zjndd_fcb4d569-d722-4e61-a4eb-e05e8bfe3927/extract-content/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.984404 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zjndd_fcb4d569-d722-4e61-a4eb-e05e8bfe3927/extract-content/0.log" Sep 30 08:03:41 crc kubenswrapper[4691]: I0930 08:03:41.993785 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zjndd_fcb4d569-d722-4e61-a4eb-e05e8bfe3927/extract-utilities/0.log" Sep 30 08:03:42 crc kubenswrapper[4691]: I0930 08:03:42.143036 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zjndd_fcb4d569-d722-4e61-a4eb-e05e8bfe3927/registry-server/0.log" Sep 30 08:03:44 crc kubenswrapper[4691]: I0930 08:03:44.225036 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:03:44 crc kubenswrapper[4691]: E0930 08:03:44.225632 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:03:55 crc kubenswrapper[4691]: I0930 08:03:55.268808 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-7c8cf85677-xqs9h_276e7e45-2756-4551-867f-2184113b0749/prometheus-operator/0.log" Sep 30 08:03:55 crc kubenswrapper[4691]: I0930 08:03:55.448200 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67d746f8c7-h259t_b9ef8251-85be-4df7-9372-65a9fa9db6f7/prometheus-operator-admission-webhook/0.log" Sep 30 08:03:55 crc kubenswrapper[4691]: I0930 08:03:55.449563 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-67d746f8c7-2qbrt_5a44c0de-cf12-49e9-9f72-eb618b14445b/prometheus-operator-admission-webhook/0.log" Sep 30 08:03:55 crc kubenswrapper[4691]: I0930 08:03:55.629408 
4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-cc5f78dfc-jw7dt_426f2d02-4b9e-432d-a888-c799b2db417a/operator/0.log" Sep 30 08:03:55 crc kubenswrapper[4691]: I0930 08:03:55.677340 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-54bc95c9fb-fmpwf_e6e2f68b-8f48-4a4d-a96d-400c32cb80c9/perses-operator/0.log" Sep 30 08:03:57 crc kubenswrapper[4691]: I0930 08:03:57.232257 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:03:57 crc kubenswrapper[4691]: E0930 08:03:57.232859 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:04:12 crc kubenswrapper[4691]: I0930 08:04:12.224927 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:04:12 crc kubenswrapper[4691]: E0930 08:04:12.225909 4691 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4w4k6_openshift-machine-config-operator(69b46ade-8260-448f-84b7-506632d23ff9)\"" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" Sep 30 08:04:26 crc kubenswrapper[4691]: I0930 08:04:26.226183 4691 scope.go:117] "RemoveContainer" containerID="aec37d63291116406dc4ed4a91e1140d14893da8976b65afbaac021af006f18a" Sep 30 08:04:26 crc kubenswrapper[4691]: I0930 08:04:26.809623 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" event={"ID":"69b46ade-8260-448f-84b7-506632d23ff9","Type":"ContainerStarted","Data":"720194b32f3a6c315bea458df0227b94780babcf3f2f6f3d6352a4710a079f46"} Sep 30 08:04:28 crc kubenswrapper[4691]: I0930 08:04:28.418197 4691 scope.go:117] "RemoveContainer" containerID="82475612db2b436ddb4675ee45c6616d96431ee4be999d2d1d4ba3134c947fca" Sep 30 08:04:28 crc kubenswrapper[4691]: I0930 08:04:28.467626 4691 scope.go:117] "RemoveContainer" containerID="769eb708f5c4c3baf99f9ed40a6df72d89c327e0275ef32f513fd816f0adaf38" Sep 30 08:04:28 crc kubenswrapper[4691]: I0930 08:04:28.529670 4691 scope.go:117] "RemoveContainer" containerID="5a99d33905d64610f40669d0a4ad390e3f5974ad29f9ef2bbb996ef1295db38f" Sep 30 08:06:05 crc kubenswrapper[4691]: I0930 08:06:05.011499 4691 generic.go:334] "Generic (PLEG): container finished" podID="39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1" containerID="f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201" exitCode=0 Sep 30 08:06:05 crc kubenswrapper[4691]: I0930 08:06:05.011608 4691 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-hhnpm/must-gather-ccdnm" event={"ID":"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1","Type":"ContainerDied","Data":"f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201"} Sep 30 08:06:05 crc kubenswrapper[4691]: I0930 08:06:05.013237 4691 scope.go:117] "RemoveContainer" 
containerID="f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201" Sep 30 08:06:05 crc kubenswrapper[4691]: I0930 08:06:05.598690 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hhnpm_must-gather-ccdnm_39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1/gather/0.log" Sep 30 08:06:19 crc kubenswrapper[4691]: I0930 08:06:19.397735 4691 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-hhnpm/must-gather-ccdnm"] Sep 30 08:06:19 crc kubenswrapper[4691]: I0930 08:06:19.398742 4691 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-hhnpm/must-gather-ccdnm" podUID="39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1" containerName="copy" containerID="cri-o://dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22" gracePeriod=2 Sep 30 08:06:19 crc kubenswrapper[4691]: I0930 08:06:19.405543 4691 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-hhnpm/must-gather-ccdnm"] Sep 30 08:06:19 crc kubenswrapper[4691]: I0930 08:06:19.886720 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hhnpm_must-gather-ccdnm_39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1/copy/0.log" Sep 30 08:06:19 crc kubenswrapper[4691]: I0930 08:06:19.887267 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hhnpm/must-gather-ccdnm" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.063866 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwsjb\" (UniqueName: \"kubernetes.io/projected/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-kube-api-access-xwsjb\") pod \"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1\" (UID: \"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1\") " Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.064156 4691 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-must-gather-output\") pod \"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1\" (UID: \"39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1\") " Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.070247 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-kube-api-access-xwsjb" (OuterVolumeSpecName: "kube-api-access-xwsjb") pod "39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1" (UID: "39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1"). InnerVolumeSpecName "kube-api-access-xwsjb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.166762 4691 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwsjb\" (UniqueName: \"kubernetes.io/projected/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-kube-api-access-xwsjb\") on node \"crc\" DevicePath \"\"" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.190971 4691 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-hhnpm_must-gather-ccdnm_39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1/copy/0.log" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.191574 4691 generic.go:334] "Generic (PLEG): container finished" podID="39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1" containerID="dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22" exitCode=143 Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.191641 4691 scope.go:117] "RemoveContainer" containerID="dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.191812 4691 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-hhnpm/must-gather-ccdnm" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.233581 4691 scope.go:117] "RemoveContainer" containerID="f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.248718 4691 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1" (UID: "39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.268430 4691 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1-must-gather-output\") on node \"crc\" DevicePath \"\"" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.322375 4691 scope.go:117] "RemoveContainer" containerID="dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22" Sep 30 08:06:20 crc kubenswrapper[4691]: E0930 08:06:20.323049 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22\": container with ID starting with dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22 not found: ID does not exist" containerID="dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.323154 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22"} err="failed to get container status \"dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22\": rpc error: code = NotFound desc = could not find container \"dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22\": container with ID starting with dab23aa0aaf3c335a7a53c2ef34d634faa252f4efd9be9a8ae036531bb11eb22 not found: ID does not exist" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.323246 4691 scope.go:117] "RemoveContainer" containerID="f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201" Sep 30 08:06:20 crc 
kubenswrapper[4691]: E0930 08:06:20.323696 4691 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201\": container with ID starting with f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201 not found: ID does not exist" containerID="f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201" Sep 30 08:06:20 crc kubenswrapper[4691]: I0930 08:06:20.323791 4691 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201"} err="failed to get container status \"f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201\": rpc error: code = NotFound desc = could not find container \"f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201\": container with ID starting with f950cdf871747e448a2a00653d859efb67ddab749746d91be32f1acf40bac201 not found: ID does not exist" Sep 30 08:06:21 crc kubenswrapper[4691]: I0930 08:06:21.245777 4691 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1" path="/var/lib/kubelet/pods/39eb777b-f6a5-40bc-b2c2-f27c3c6ed1e1/volumes" Sep 30 08:06:52 crc kubenswrapper[4691]: I0930 08:06:52.850389 4691 patch_prober.go:28] interesting pod/machine-config-daemon-4w4k6 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Sep 30 08:06:52 crc kubenswrapper[4691]: I0930 08:06:52.851230 4691 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4w4k6" podUID="69b46ade-8260-448f-84b7-506632d23ff9" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"